Dec 01 19:56:17 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 19:56:17 crc restorecon[4763]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 19:56:17 crc restorecon[4763]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc 
restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc 
restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 
19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc 
restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc 
restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:17
crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 
19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc 
restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:17 crc 
restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:17 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 19:56:18 crc restorecon[4763]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc 
restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc 
restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc 
restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc 
restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc 
restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 19:56:18 crc restorecon[4763]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 01 19:56:18 crc kubenswrapper[4802]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 19:56:18 crc kubenswrapper[4802]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 01 19:56:18 crc kubenswrapper[4802]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 19:56:18 crc kubenswrapper[4802]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 01 19:56:18 crc kubenswrapper[4802]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 01 19:56:18 crc kubenswrapper[4802]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.558582 4802 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561184 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561216 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561224 4802 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561229 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561240 4802 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561244 4802 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561248 4802 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561252 4802 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561255 4802 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561259 4802 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561262 4802 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561266 4802 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561270 4802 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561273 4802 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561277 4802 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561281 4802 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561285 4802 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561290 4802 
feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561294 4802 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561298 4802 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561303 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561307 4802 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561312 4802 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561316 4802 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561320 4802 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561325 4802 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561329 4802 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561333 4802 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561338 4802 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561341 4802 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561345 4802 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561348 4802 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 19:56:18 crc kubenswrapper[4802]: 
W1201 19:56:18.561352 4802 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561355 4802 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561359 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561362 4802 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561365 4802 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561369 4802 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561372 4802 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561376 4802 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561385 4802 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561389 4802 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561393 4802 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561397 4802 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561401 4802 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561406 4802 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561411 4802 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561415 4802 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561419 4802 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561423 4802 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561427 4802 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561431 4802 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561435 4802 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561439 4802 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561443 4802 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
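The same "unrecognized feature gate" warnings recur many times in this log (the full gate set is logged more than once). To see which gates are actually involved, the warning lines can be deduplicated with a short script; a minimal sketch, where the sample lines and the `unique_gates` helper are illustrative (the regex assumes the `feature_gate.go:330] unrecognized feature gate: <Name>` shape seen above):

```python
import re

# Hypothetical sample in the same shape as the journal entries above
# (abridged; a real run would read the full journal output instead).
LOG = """\
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561411 4802 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561415 4802 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562748 4802 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
"""

GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def unique_gates(log_text: str) -> list[str]:
    # Collect every gate name flagged as unrecognized, deduplicated and sorted.
    return sorted(set(GATE_RE.findall(log_text)))

print(unique_gates(LOG))  # → ['AdminNetworkPolicy', 'OpenShiftPodSecurityAdmission']
```

In practice the input would come from something like `journalctl -u kubelet`; duplicates arise because the kubelet parses its feature-gate configuration more than once during startup.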
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561447 4802 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561451 4802 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561455 4802 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561459 4802 feature_gate.go:330] unrecognized feature gate: Example Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561462 4802 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561466 4802 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561470 4802 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561473 4802 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561477 4802 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561481 4802 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561484 4802 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561488 4802 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561491 4802 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561495 4802 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561498 4802 
feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.561501 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561720 4802 flags.go:64] FLAG: --address="0.0.0.0" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561729 4802 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561739 4802 flags.go:64] FLAG: --anonymous-auth="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561744 4802 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561749 4802 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561760 4802 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561765 4802 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561771 4802 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561775 4802 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561779 4802 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561783 4802 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561787 4802 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561793 4802 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561798 4802 flags.go:64] FLAG: --cgroup-root="" Dec 01 19:56:18 crc 
kubenswrapper[4802]: I1201 19:56:18.561801 4802 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561806 4802 flags.go:64] FLAG: --client-ca-file="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561809 4802 flags.go:64] FLAG: --cloud-config="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561814 4802 flags.go:64] FLAG: --cloud-provider="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561817 4802 flags.go:64] FLAG: --cluster-dns="[]" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561825 4802 flags.go:64] FLAG: --cluster-domain="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561829 4802 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561833 4802 flags.go:64] FLAG: --config-dir="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561837 4802 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561842 4802 flags.go:64] FLAG: --container-log-max-files="5" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561847 4802 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561851 4802 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561855 4802 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561863 4802 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561867 4802 flags.go:64] FLAG: --contention-profiling="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561871 4802 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561875 4802 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 01 19:56:18 crc 
kubenswrapper[4802]: I1201 19:56:18.561879 4802 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561883 4802 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561888 4802 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561892 4802 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561896 4802 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561900 4802 flags.go:64] FLAG: --enable-load-reader="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561904 4802 flags.go:64] FLAG: --enable-server="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561908 4802 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561915 4802 flags.go:64] FLAG: --event-burst="100" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561919 4802 flags.go:64] FLAG: --event-qps="50" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561930 4802 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561935 4802 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561939 4802 flags.go:64] FLAG: --eviction-hard="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561944 4802 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561948 4802 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561952 4802 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561956 4802 flags.go:64] FLAG: --eviction-soft="" Dec 01 19:56:18 
crc kubenswrapper[4802]: I1201 19:56:18.561960 4802 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561964 4802 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561968 4802 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561972 4802 flags.go:64] FLAG: --experimental-mounter-path="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561976 4802 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561980 4802 flags.go:64] FLAG: --fail-swap-on="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561984 4802 flags.go:64] FLAG: --feature-gates="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561990 4802 flags.go:64] FLAG: --file-check-frequency="20s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561994 4802 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561998 4802 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562002 4802 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562006 4802 flags.go:64] FLAG: --healthz-port="10248" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562010 4802 flags.go:64] FLAG: --help="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562014 4802 flags.go:64] FLAG: --hostname-override="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562018 4802 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562022 4802 flags.go:64] FLAG: --http-check-frequency="20s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562027 4802 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 01 19:56:18 crc 
kubenswrapper[4802]: I1201 19:56:18.562030 4802 flags.go:64] FLAG: --image-credential-provider-config="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562034 4802 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562038 4802 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562042 4802 flags.go:64] FLAG: --image-service-endpoint="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562046 4802 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562050 4802 flags.go:64] FLAG: --kube-api-burst="100" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562054 4802 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562058 4802 flags.go:64] FLAG: --kube-api-qps="50" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562061 4802 flags.go:64] FLAG: --kube-reserved="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562065 4802 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562069 4802 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562074 4802 flags.go:64] FLAG: --kubelet-cgroups="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562083 4802 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562088 4802 flags.go:64] FLAG: --lock-file="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562091 4802 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562095 4802 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562099 4802 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 01 19:56:18 
crc kubenswrapper[4802]: I1201 19:56:18.562105 4802 flags.go:64] FLAG: --log-json-split-stream="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562109 4802 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562113 4802 flags.go:64] FLAG: --log-text-split-stream="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562117 4802 flags.go:64] FLAG: --logging-format="text" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562121 4802 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562126 4802 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562130 4802 flags.go:64] FLAG: --manifest-url="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562134 4802 flags.go:64] FLAG: --manifest-url-header="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562139 4802 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562143 4802 flags.go:64] FLAG: --max-open-files="1000000" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562148 4802 flags.go:64] FLAG: --max-pods="110" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562152 4802 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562156 4802 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562160 4802 flags.go:64] FLAG: --memory-manager-policy="None" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562164 4802 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562168 4802 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562171 4802 flags.go:64] FLAG: 
--node-ip="192.168.126.11" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562175 4802 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562184 4802 flags.go:64] FLAG: --node-status-max-images="50" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562188 4802 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562207 4802 flags.go:64] FLAG: --oom-score-adj="-999" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562211 4802 flags.go:64] FLAG: --pod-cidr="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562215 4802 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562225 4802 flags.go:64] FLAG: --pod-manifest-path="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562229 4802 flags.go:64] FLAG: --pod-max-pids="-1" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562233 4802 flags.go:64] FLAG: --pods-per-core="0" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562237 4802 flags.go:64] FLAG: --port="10250" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562241 4802 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562245 4802 flags.go:64] FLAG: --provider-id="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562249 4802 flags.go:64] FLAG: --qos-reserved="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562252 4802 flags.go:64] FLAG: --read-only-port="10255" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562264 4802 flags.go:64] FLAG: --register-node="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562268 4802 flags.go:64] FLAG: 
--register-schedulable="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562272 4802 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562279 4802 flags.go:64] FLAG: --registry-burst="10" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562283 4802 flags.go:64] FLAG: --registry-qps="5" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562288 4802 flags.go:64] FLAG: --reserved-cpus="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562292 4802 flags.go:64] FLAG: --reserved-memory="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562297 4802 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562301 4802 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562305 4802 flags.go:64] FLAG: --rotate-certificates="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562309 4802 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562313 4802 flags.go:64] FLAG: --runonce="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562317 4802 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562321 4802 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562326 4802 flags.go:64] FLAG: --seccomp-default="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562329 4802 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562333 4802 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562337 4802 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562342 4802 
flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562346 4802 flags.go:64] FLAG: --storage-driver-password="root" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562349 4802 flags.go:64] FLAG: --storage-driver-secure="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562353 4802 flags.go:64] FLAG: --storage-driver-table="stats" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562357 4802 flags.go:64] FLAG: --storage-driver-user="root" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562361 4802 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562366 4802 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562370 4802 flags.go:64] FLAG: --system-cgroups="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562374 4802 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562381 4802 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562391 4802 flags.go:64] FLAG: --tls-cert-file="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562395 4802 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562403 4802 flags.go:64] FLAG: --tls-min-version="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562408 4802 flags.go:64] FLAG: --tls-private-key-file="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562412 4802 flags.go:64] FLAG: --topology-manager-policy="none" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562416 4802 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562420 4802 flags.go:64] FLAG: --topology-manager-scope="container" Dec 01 19:56:18 crc kubenswrapper[4802]: 
I1201 19:56:18.562424 4802 flags.go:64] FLAG: --v="2" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562438 4802 flags.go:64] FLAG: --version="false" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562445 4802 flags.go:64] FLAG: --vmodule="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562450 4802 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562454 4802 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562575 4802 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562579 4802 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562584 4802 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562589 4802 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562592 4802 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562604 4802 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562608 4802 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562613 4802 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
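The long `flags.go:64] FLAG: --name="value"` dump above records the kubelet's effective command-line configuration one flag per entry. It can be folded into a lookup table with a few lines of code; a minimal sketch, where the sample lines and the `parse_flags` helper are illustrative (the regex assumes the quoted `FLAG: --name="value"` shape seen above):

```python
import re

# Hypothetical sample in the same shape as the FLAG entries above (abridged).
SAMPLE = """\
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561720 4802 flags.go:64] FLAG: --address="0.0.0.0"
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.561749 4802 flags.go:64] FLAG: --authentication-token-webhook="false"
"""

FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="([^"]*)"')

def parse_flags(log_text: str) -> dict[str, str]:
    # Map each logged flag name to its (string) value.
    return {name: value for name, value in FLAG_RE.findall(log_text)}

flags = parse_flags(SAMPLE)
print(flags["--address"])  # → 0.0.0.0
```

Note that values are logged as strings even when they are booleans, durations, or lists (e.g. `--cluster-dns="[]"`), so any further interpretation has to be done per flag.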
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562617 4802 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562621 4802 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562625 4802 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562628 4802 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562632 4802 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562635 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562639 4802 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562642 4802 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562646 4802 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562650 4802 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562655 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562658 4802 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562664 4802 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562667 4802 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562671 4802 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562674 4802 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562679 4802 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562683 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562686 4802 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562690 4802 feature_gate.go:330] unrecognized feature gate: Example Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562693 4802 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562698 4802 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562702 4802 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562705 4802 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562714 
4802 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562718 4802 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562724 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562727 4802 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562731 4802 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562734 4802 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562738 4802 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562741 4802 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562745 4802 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562748 4802 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562752 4802 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562755 4802 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562758 4802 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562762 4802 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562765 4802 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 
19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562769 4802 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562772 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562775 4802 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562779 4802 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562782 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562787 4802 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562790 4802 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562794 4802 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562797 4802 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562801 4802 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562804 4802 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562808 4802 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562811 4802 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562814 4802 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 
19:56:18.562819 4802 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562822 4802 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562826 4802 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562829 4802 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562834 4802 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562838 4802 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562843 4802 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562852 4802 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562856 4802 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.562859 4802 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.562870 4802 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.572759 4802 
server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.572789 4802 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572882 4802 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572890 4802 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572899 4802 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572908 4802 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572914 4802 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572920 4802 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572926 4802 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572932 4802 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572937 4802 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572942 4802 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572948 4802 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572953 4802 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 
19:56:18.572959 4802 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572964 4802 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572970 4802 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572975 4802 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572981 4802 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572986 4802 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572991 4802 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.572996 4802 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573002 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573007 4802 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573012 4802 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573019 4802 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573024 4802 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573030 4802 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573035 4802 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 19:56:18 crc 
kubenswrapper[4802]: W1201 19:56:18.573040 4802 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573045 4802 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573051 4802 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573057 4802 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573063 4802 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573072 4802 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573079 4802 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573087 4802 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573101 4802 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573107 4802 feature_gate.go:330] unrecognized feature gate: Example Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573112 4802 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573117 4802 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573123 4802 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573128 4802 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573133 4802 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573139 4802 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573144 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573149 4802 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573156 4802 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573163 4802 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573168 4802 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573174 4802 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573180 4802 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573187 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573222 4802 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573228 4802 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573234 4802 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573240 4802 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573245 4802 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573251 4802 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573257 4802 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573263 4802 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573268 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573274 4802 feature_gate.go:330] 
unrecognized feature gate: GCPClusterHostedDNS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573280 4802 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573287 4802 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573292 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573298 4802 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573303 4802 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573308 4802 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573314 4802 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573319 4802 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573324 4802 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573329 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.573338 4802 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true 
VolumeAttributesClass:false]} Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573500 4802 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573509 4802 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573515 4802 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573521 4802 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573526 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573531 4802 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573536 4802 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573542 4802 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573547 4802 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573552 4802 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573558 4802 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573563 4802 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573568 4802 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573574 4802 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573579 
4802 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573584 4802 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573589 4802 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573594 4802 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573600 4802 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573606 4802 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573611 4802 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573617 4802 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573623 4802 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573628 4802 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573634 4802 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573640 4802 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573645 4802 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573650 4802 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573656 4802 feature_gate.go:330] unrecognized 
feature gate: GatewayAPI Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573661 4802 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573666 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573671 4802 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573676 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573683 4802 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573688 4802 feature_gate.go:330] unrecognized feature gate: Example Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573693 4802 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573698 4802 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573705 4802 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573711 4802 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573718 4802 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573723 4802 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573729 4802 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573734 4802 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573739 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573744 4802 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573749 4802 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573755 4802 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573760 4802 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573765 4802 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573772 4802 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573779 4802 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573785 4802 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573792 4802 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573799 4802 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573808 4802 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573813 4802 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573821 4802 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573827 4802 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573832 4802 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573837 4802 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573842 4802 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573848 4802 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573853 4802 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573860 4802 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573866 4802 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573873 4802 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573879 4802 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573884 4802 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573889 4802 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573895 4802 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.573900 4802 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.573908 4802 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.574089 4802 server.go:940] "Client rotation is on, will bootstrap in background" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.577463 4802 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.577564 4802 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.578339 4802 server.go:997] "Starting client certificate rotation" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.578364 4802 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.578712 4802 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-10 03:08:43.278785166 +0000 UTC Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.578789 4802 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 199h12m24.699999182s for next certificate rotation Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.584043 4802 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.586729 4802 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.596373 4802 log.go:25] "Validated CRI v1 runtime API" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.616319 4802 log.go:25] "Validated CRI v1 image API" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.618183 4802 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.620699 4802 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-19-51-41-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.620730 4802 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.641521 4802 manager.go:217] Machine: {Timestamp:2025-12-01 19:56:18.639184955 +0000 UTC m=+0.201744696 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7ca05b31-e838-4494-a138-5a5047e18b0e BootID:ccabb53f-70cd-48e6-8bc8-8247c89db90c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e2:1e:58 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e2:1e:58 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:6e:59:c8 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:61:3a:c8 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:40:dc:3b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:68:44:98 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:20:a3:80 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b6:fc:ac:9e:ad:63 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ba:38:8a:42:3a:89 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.641939 4802 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.642155 4802 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.643139 4802 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.643565 4802 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.643625 4802 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.643985 4802 topology_manager.go:138] "Creating topology manager with none policy" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.644005 4802 container_manager_linux.go:303] "Creating device plugin manager" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.644370 4802 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.644425 4802 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.644671 4802 state_mem.go:36] "Initialized new in-memory state store" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.645144 4802 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.646058 4802 kubelet.go:418] "Attempting to sync node with API server" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.646096 4802 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.646137 4802 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.646159 4802 kubelet.go:324] "Adding apiserver pod source" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.646181 4802 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 
19:56:18.648530 4802 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.649076 4802 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.649484 4802 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.649567 4802 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.649653 4802 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.649724 4802 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.650344 4802 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 01 19:56:18 
crc kubenswrapper[4802]: I1201 19:56:18.651142 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651187 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651237 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651256 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651286 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651304 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651321 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651349 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651371 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651386 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651440 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.651454 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.652308 4802 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.653144 4802 server.go:1280] "Started kubelet" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 
19:56:18.653611 4802 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.654045 4802 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.654406 4802 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 01 19:56:18 crc systemd[1]: Started Kubernetes Kubelet. Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.655511 4802 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.656583 4802 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.656617 4802 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.657161 4802 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.657189 4802 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.657403 4802 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.657052 4802 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 11:22:59.918433573 +0000 UTC Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.661336 4802 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1047h26m41.257111485s for next certificate rotation Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 
19:56:18.662540 4802 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.662602 4802 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.662779 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.661907 4802 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d2f9c9a879975 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 19:56:18.653084021 +0000 UTC m=+0.215643722,LastTimestamp:2025-12-01 19:56:18.653084021 +0000 UTC m=+0.215643722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.662969 4802 desired_state_of_world_populator.go:146] "Desired state populator 
starts to run" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.663406 4802 factory.go:153] Registering CRI-O factory Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.663456 4802 factory.go:221] Registration of the crio container factory successfully Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.663658 4802 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.663690 4802 factory.go:55] Registering systemd factory Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.663708 4802 factory.go:221] Registration of the systemd container factory successfully Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.663865 4802 factory.go:103] Registering Raw factory Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.664161 4802 manager.go:1196] Started watching for new ooms in manager Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.664757 4802 server.go:460] "Adding debug handlers to kubelet server" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.667180 4802 manager.go:319] Starting recovery of all containers Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674548 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674615 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674638 
4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674657 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674677 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674696 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674716 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674735 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674791 4802 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674813 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674833 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674851 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674870 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674893 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674911 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674930 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674950 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674968 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.674985 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675003 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675020 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675039 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675059 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675087 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675107 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675126 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675148 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" 
seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675168 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675222 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675242 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675259 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675278 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675311 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675357 4802 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675377 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675395 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675413 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675432 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675451 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675471 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675489 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675508 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675527 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675545 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675563 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675583 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675602 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675620 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675641 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675658 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675683 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675701 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675724 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675744 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675766 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675786 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675807 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675827 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" 
seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675846 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675863 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675880 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675897 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675915 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675933 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675951 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675970 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.675988 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676007 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676025 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676042 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676061 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676079 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676096 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676115 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676133 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676153 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676172 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676189 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676233 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676252 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676272 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676290 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676312 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676330 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676348 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676366 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676386 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676403 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676421 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676440 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676458 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676475 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676492 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676512 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676530 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676550 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676570 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676587 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676605 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676623 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676642 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676661 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676680 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676700 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676725 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676744 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676764 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676783 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676803 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676825 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.676845 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677658 4802 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677697 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677721 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677743 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677763 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677783 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677804 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677823 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677842 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677861 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677879 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677897 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677916 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677934 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677953 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677971 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.677989 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678008 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678027 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678045 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678064 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678082 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678101 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678119 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678164 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678184 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678231 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678249 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678269 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678287 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678306 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678323 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678342 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678362 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678380 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678411 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678429 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678449 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678467 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678488 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678510 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678528 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678546 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678564 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678584 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678604 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678623 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678643 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678661 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678680 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678699 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678717 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678735 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678789 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678810 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678830 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678849 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678867 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678888 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678908 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678926 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678944 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678963 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.678982 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679000 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679019 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679037 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679061 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679078 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert"
seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679099 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679116 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679136 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679154 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679173 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679217 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 
19:56:18.679236 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679255 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679273 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679291 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679309 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679327 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679347 4802 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679365 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679383 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679401 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679421 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679438 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679457 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679475 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679493 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679512 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679531 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679550 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679568 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" 
seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679587 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679605 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679624 4802 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679644 4802 reconstruct.go:97] "Volume reconstruction finished" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.679657 4802 reconciler.go:26] "Reconciler: start to sync state" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.697426 4802 manager.go:324] Recovery completed Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.712650 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.715844 4802 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.717023 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.717069 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.717097 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.718628 4802 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.718718 4802 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.718750 4802 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.718683 4802 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.718808 4802 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.718838 4802 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.718884 4802 state_mem.go:36] "Initialized new in-memory state store" Dec 01 19:56:18 crc kubenswrapper[4802]: W1201 19:56:18.719614 4802 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.719691 4802 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.727751 4802 policy_none.go:49] "None policy: Start" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.728459 4802 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.728488 4802 state_mem.go:35] "Initializing new in-memory state store" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.761367 4802 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.790450 4802 manager.go:334] "Starting Device Plugin manager" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.790820 4802 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.790834 4802 server.go:79] "Starting device plugin registration server" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.791186 4802 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.791287 4802 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.791636 4802 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.791710 4802 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.791716 4802 plugin_manager.go:118] "Starting Kubelet 
Plugin Manager" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.801844 4802 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.819762 4802 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.819825 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.820775 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.820869 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.820892 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.821376 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.821461 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.821487 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.822181 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.822223 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.822231 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.823088 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.823107 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.823115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.823220 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.823423 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.823463 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.824140 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.824151 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.824170 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.824176 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.824178 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.824185 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.824327 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.824354 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.824374 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.825096 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.825133 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.825143 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.825351 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.825380 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.825391 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.825483 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.825545 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.825571 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.826312 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.826334 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.826345 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.826462 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.826481 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.826551 4802 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d2f9c9a879975 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 19:56:18.653084021 +0000 UTC m=+0.215643722,LastTimestamp:2025-12-01 19:56:18.653084021 +0000 UTC m=+0.215643722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.826834 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.827042 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.827062 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.827070 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.827298 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.828287 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.863873 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.882270 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.882534 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.882714 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.882856 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.882994 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.883132 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.883357 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.883582 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.883732 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.883877 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.884013 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.884154 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.884327 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.884479 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.884625 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.892424 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.893779 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.893829 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.893847 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:18 crc 
kubenswrapper[4802]: I1201 19:56:18.893880 4802 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 19:56:18 crc kubenswrapper[4802]: E1201 19:56:18.894513 4802 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.985978 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986263 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986367 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986439 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986437 4802 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986465 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986659 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986738 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986797 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986811 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc 
kubenswrapper[4802]: I1201 19:56:18.986577 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986608 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.986945 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987028 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987125 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987192 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987233 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987376 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987451 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987520 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987535 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987649 4802 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987626 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987596 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987754 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987840 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987893 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987898 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987966 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:18 crc kubenswrapper[4802]: I1201 19:56:18.987996 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.094758 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.096259 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.096417 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.096556 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 
19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.096652 4802 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 19:56:19 crc kubenswrapper[4802]: E1201 19:56:19.097191 4802 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.162830 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.169628 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 19:56:19 crc kubenswrapper[4802]: W1201 19:56:19.187345 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ac62b8ed988cbd0157381421feb6a80b13e3af3a2e67eac39b0018fc0cfc7773 WatchSource:0}: Error finding container ac62b8ed988cbd0157381421feb6a80b13e3af3a2e67eac39b0018fc0cfc7773: Status 404 returned error can't find the container with id ac62b8ed988cbd0157381421feb6a80b13e3af3a2e67eac39b0018fc0cfc7773 Dec 01 19:56:19 crc kubenswrapper[4802]: W1201 19:56:19.188744 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-460e3dd179f83231f3ed4ac11132c7669663028956bc8c6ca48c1c6160b075d6 WatchSource:0}: Error finding container 460e3dd179f83231f3ed4ac11132c7669663028956bc8c6ca48c1c6160b075d6: Status 404 returned error can't find the container with id 460e3dd179f83231f3ed4ac11132c7669663028956bc8c6ca48c1c6160b075d6 Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.202775 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:19 crc kubenswrapper[4802]: W1201 19:56:19.222014 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7fdb6242b9f5f63e132021ab7d4cffd6438fc6f992c99a4c2d67f51d268f82aa WatchSource:0}: Error finding container 7fdb6242b9f5f63e132021ab7d4cffd6438fc6f992c99a4c2d67f51d268f82aa: Status 404 returned error can't find the container with id 7fdb6242b9f5f63e132021ab7d4cffd6438fc6f992c99a4c2d67f51d268f82aa Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.229404 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.237575 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:19 crc kubenswrapper[4802]: W1201 19:56:19.245005 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5d75a3287025f02f275d1d93977e3b731e8711deb29bf3b57a3f9d8875e8a7b5 WatchSource:0}: Error finding container 5d75a3287025f02f275d1d93977e3b731e8711deb29bf3b57a3f9d8875e8a7b5: Status 404 returned error can't find the container with id 5d75a3287025f02f275d1d93977e3b731e8711deb29bf3b57a3f9d8875e8a7b5 Dec 01 19:56:19 crc kubenswrapper[4802]: W1201 19:56:19.252596 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8ed6e0453332b5ed93e38e5abdac14086ff296ea4352b382f3040bce420795f8 WatchSource:0}: Error finding container 8ed6e0453332b5ed93e38e5abdac14086ff296ea4352b382f3040bce420795f8: Status 404 returned error can't find 
the container with id 8ed6e0453332b5ed93e38e5abdac14086ff296ea4352b382f3040bce420795f8 Dec 01 19:56:19 crc kubenswrapper[4802]: E1201 19:56:19.265100 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.497467 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.498483 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.498507 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.498516 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.498535 4802 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 19:56:19 crc kubenswrapper[4802]: E1201 19:56:19.499185 4802 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 01 19:56:19 crc kubenswrapper[4802]: W1201 19:56:19.645711 4802 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:19 crc kubenswrapper[4802]: E1201 19:56:19.646061 4802 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.654729 4802 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.725634 4802 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9" exitCode=0 Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.725787 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9"} Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.725971 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5d75a3287025f02f275d1d93977e3b731e8711deb29bf3b57a3f9d8875e8a7b5"} Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.726135 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.727738 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.727802 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 
19:56:19.727824 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.728222 4802 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7" exitCode=0 Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.728294 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7"} Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.728330 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7fdb6242b9f5f63e132021ab7d4cffd6438fc6f992c99a4c2d67f51d268f82aa"} Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.728424 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.729563 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.729596 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.729607 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.730766 4802 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ab4a806b9002a57bdaaea82b14656b90dc50b57f1e6246fc5d46d6977ca2a6dc" exitCode=0 Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 
19:56:19.730830 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ab4a806b9002a57bdaaea82b14656b90dc50b57f1e6246fc5d46d6977ca2a6dc"} Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.730856 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac62b8ed988cbd0157381421feb6a80b13e3af3a2e67eac39b0018fc0cfc7773"} Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.730947 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.731169 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.731701 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.731755 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.731778 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.732623 4802 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac" exitCode=0 Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.732666 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac"} Dec 01 19:56:19 crc 
kubenswrapper[4802]: I1201 19:56:19.732684 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"460e3dd179f83231f3ed4ac11132c7669663028956bc8c6ca48c1c6160b075d6"} Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.732733 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.733492 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.733510 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.733514 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.733565 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.733539 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.733647 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.735560 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8"} Dec 01 19:56:19 crc kubenswrapper[4802]: I1201 19:56:19.735583 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8ed6e0453332b5ed93e38e5abdac14086ff296ea4352b382f3040bce420795f8"} Dec 01 19:56:19 crc kubenswrapper[4802]: W1201 19:56:19.869561 4802 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:19 crc kubenswrapper[4802]: E1201 19:56:19.869636 4802 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 01 19:56:20 crc kubenswrapper[4802]: E1201 19:56:20.066729 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Dec 01 19:56:20 crc kubenswrapper[4802]: W1201 19:56:20.168803 4802 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:20 crc kubenswrapper[4802]: E1201 19:56:20.168897 4802 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" 
logger="UnhandledError" Dec 01 19:56:20 crc kubenswrapper[4802]: W1201 19:56:20.212230 4802 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 01 19:56:20 crc kubenswrapper[4802]: E1201 19:56:20.212330 4802 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.299764 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.301688 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.301720 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.301728 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.301757 4802 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.740575 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.740632 4802 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.740646 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.740645 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.741602 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.741635 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.741643 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.743799 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.743841 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.743858 4802 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.744045 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.745454 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.745482 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.745492 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.748028 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.748058 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.748070 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.748083 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.748094 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.748244 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.749632 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.749664 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.749679 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.749681 4802 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e" exitCode=0 Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.749742 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.749846 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.750463 4802 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.750482 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.750489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.751831 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4442bba04936832a31754ab2a26103c31da700a120cd81b45dcf53c004e9b46a"} Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.751909 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.753433 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.753455 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:20 crc kubenswrapper[4802]: I1201 19:56:20.753466 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.757422 4802 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e" exitCode=0 Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.757621 4802 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.757671 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 
19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.757868 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e"} Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.757974 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.758010 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.759179 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.759225 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.759235 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.759258 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.759304 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.759333 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.759437 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.759471 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 
01 19:56:21 crc kubenswrapper[4802]: I1201 19:56:21.759594 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:22 crc kubenswrapper[4802]: I1201 19:56:22.336916 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:22 crc kubenswrapper[4802]: I1201 19:56:22.764769 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:22 crc kubenswrapper[4802]: I1201 19:56:22.764897 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6c49e47677142d724eb56b940158bc7ff30886cb98911431ef3a3ecd63969ea6"} Dec 01 19:56:22 crc kubenswrapper[4802]: I1201 19:56:22.765065 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4bfbe6b9c9e36b730ca3b4a6a42c15028518d500a9e0743c0d2dd8626e06f4b6"} Dec 01 19:56:22 crc kubenswrapper[4802]: I1201 19:56:22.765094 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cef2d82c0514ca36725246db0e4a9a5c5015f34a0c280689d0d3e0cbcde56b88"} Dec 01 19:56:22 crc kubenswrapper[4802]: I1201 19:56:22.765110 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d445731c7cee7ec09d220df6418d3b6752b956d188c3d853ac470dfa11037747"} Dec 01 19:56:22 crc kubenswrapper[4802]: I1201 19:56:22.765853 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:22 crc kubenswrapper[4802]: I1201 19:56:22.765905 4802 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:22 crc kubenswrapper[4802]: I1201 19:56:22.765923 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.396705 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.396853 4802 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.396894 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.398263 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.398308 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.398318 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.770744 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dc45f69bad64d259b95e47b43b39bbcd70edd29f4ef79a96f44b8c65df41c124"} Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.770866 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.771724 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.771775 4802 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:23 crc kubenswrapper[4802]: I1201 19:56:23.771797 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.096580 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.096733 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.098633 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.098715 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.098857 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.725286 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.731678 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.773183 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.773351 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.774238 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.774274 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.774285 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.774853 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.774911 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.774936 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.814615 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.814743 4802 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.814776 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.815942 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.816000 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:24 crc kubenswrapper[4802]: I1201 19:56:24.816021 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.337633 4802 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.337758 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.380643 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.775259 4802 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.775318 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.775429 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.776730 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.776807 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.776831 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.777078 4802 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.777151 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:25 crc kubenswrapper[4802]: I1201 19:56:25.777171 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.733947 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.775971 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.778629 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.778672 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.780471 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.780532 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.780549 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.780561 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.780597 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 
19:56:26.780615 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.799290 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.799491 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.800819 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.800875 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:26 crc kubenswrapper[4802]: I1201 19:56:26.800892 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:27 crc kubenswrapper[4802]: I1201 19:56:27.543608 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:56:27 crc kubenswrapper[4802]: I1201 19:56:27.543842 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:27 crc kubenswrapper[4802]: I1201 19:56:27.545480 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:27 crc kubenswrapper[4802]: I1201 19:56:27.545567 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:27 crc kubenswrapper[4802]: I1201 19:56:27.545598 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:28 crc kubenswrapper[4802]: E1201 19:56:28.802602 4802 eviction_manager.go:285] "Eviction manager: failed to get summary 
stats" err="failed to get node info: node \"crc\" not found" Dec 01 19:56:30 crc kubenswrapper[4802]: E1201 19:56:30.303638 4802 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 01 19:56:30 crc kubenswrapper[4802]: I1201 19:56:30.655808 4802 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 01 19:56:31 crc kubenswrapper[4802]: I1201 19:56:31.160896 4802 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 19:56:31 crc kubenswrapper[4802]: I1201 19:56:31.160995 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 19:56:31 crc kubenswrapper[4802]: I1201 19:56:31.167658 4802 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 19:56:31 crc kubenswrapper[4802]: I1201 19:56:31.167740 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 19:56:31 crc kubenswrapper[4802]: I1201 19:56:31.904293 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:31 crc kubenswrapper[4802]: I1201 19:56:31.906637 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:31 crc kubenswrapper[4802]: I1201 19:56:31.906716 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:31 crc kubenswrapper[4802]: I1201 19:56:31.906739 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:31 crc kubenswrapper[4802]: I1201 19:56:31.906791 4802 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 19:56:34 crc kubenswrapper[4802]: I1201 19:56:34.825025 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:34 crc kubenswrapper[4802]: I1201 19:56:34.825352 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:34 crc kubenswrapper[4802]: I1201 19:56:34.827535 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:34 crc kubenswrapper[4802]: I1201 19:56:34.827596 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:34 crc kubenswrapper[4802]: I1201 19:56:34.827619 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:34 crc kubenswrapper[4802]: I1201 19:56:34.832388 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.338676 4802 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.338808 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.424461 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.424759 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.426499 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.426545 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.426560 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.445709 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.805657 
4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.806779 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.807920 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.807970 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.807982 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.814361 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.816456 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:35 crc kubenswrapper[4802]: I1201 19:56:35.816475 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.174441 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.176439 4802 trace.go:236] Trace[1063761626]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 19:56:22.025) (total time: 14150ms): Dec 01 19:56:36 crc kubenswrapper[4802]: Trace[1063761626]: ---"Objects listed" error: 14150ms (19:56:36.176) Dec 01 19:56:36 crc kubenswrapper[4802]: Trace[1063761626]: 
[14.150557568s] [14.150557568s] END Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.176498 4802 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.177430 4802 trace.go:236] Trace[1660930979]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 19:56:22.434) (total time: 13742ms): Dec 01 19:56:36 crc kubenswrapper[4802]: Trace[1660930979]: ---"Objects listed" error: 13742ms (19:56:36.177) Dec 01 19:56:36 crc kubenswrapper[4802]: Trace[1660930979]: [13.742686982s] [13.742686982s] END Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.177467 4802 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.178362 4802 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.179316 4802 trace.go:236] Trace[1997069083]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 19:56:21.900) (total time: 14278ms): Dec 01 19:56:36 crc kubenswrapper[4802]: Trace[1997069083]: ---"Objects listed" error: 14278ms (19:56:36.179) Dec 01 19:56:36 crc kubenswrapper[4802]: Trace[1997069083]: [14.278786908s] [14.278786908s] END Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.179351 4802 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.180263 4802 trace.go:236] Trace[184206612]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 19:56:22.005) (total time: 14174ms): Dec 01 19:56:36 crc kubenswrapper[4802]: Trace[184206612]: ---"Objects listed" error: 14174ms (19:56:36.180) Dec 01 19:56:36 crc kubenswrapper[4802]: Trace[184206612]: [14.174271353s] [14.174271353s] END Dec 01 19:56:36 crc 
kubenswrapper[4802]: I1201 19:56:36.180297 4802 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.425566 4802 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.425633 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.431184 4802 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34570->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.431308 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34570->192.168.126.11:17697: read: connection reset by peer" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.657088 4802 apiserver.go:52] "Watching apiserver" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.659753 4802 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.660021 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.660357 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.660434 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.660495 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.660541 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.660573 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.660709 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.660739 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.660759 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.660776 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.663241 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.663281 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.663765 4802 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.664284 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.665104 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.665156 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.665272 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.665683 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.666395 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.666575 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 19:56:36 crc kubenswrapper[4802]: 
I1201 19:56:36.681166 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681259 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681299 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681338 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681379 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681416 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681459 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681510 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681565 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681613 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681667 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681710 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681609 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681758 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681813 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681869 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681920 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681968 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682018 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682073 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682115 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682159 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682246 4802 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682319 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682373 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682503 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682574 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682643 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682695 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682744 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682791 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682896 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682943 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682986 4802 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683047 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683251 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681715 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683324 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.681839 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682397 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682476 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682635 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683377 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683432 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683478 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683526 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683579 4802 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683629 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683679 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683733 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683785 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683835 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683884 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.684028 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.684112 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686084 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686174 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686251 
4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686286 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686325 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686357 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686380 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686409 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686442 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686464 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.687460 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.687522 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.687587 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.687620 4802 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.687754 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.688067 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.692569 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.696127 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682645 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod 
"31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682635 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682861 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.682976 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.696453 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.696103 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683034 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683028 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683293 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683416 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683624 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.683701 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.684134 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686056 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686396 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.686696 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.687350 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.687643 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.687930 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.688342 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.688682 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.689024 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.690145 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.690795 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.692353 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.692488 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.693252 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.693416 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.693601 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.693932 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.694115 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.694391 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.694403 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.694664 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.695121 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.695638 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.696009 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.696067 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.696113 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.694232 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.696743 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.696987 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.697455 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.697644 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.697792 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.698269 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.698440 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:56:37.198366491 +0000 UTC m=+18.760926132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.699803 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.699999 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.700175 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.699869 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.700369 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.700511 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.700606 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.700664 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.700736 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.700840 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.700979 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.701519 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.701913 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.702014 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.702052 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.702483 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.702541 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.702635 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.702991 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.703336 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.703653 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.703783 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.704041 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.704113 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.704863 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.705085 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.704831 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.705835 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.705894 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.705929 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.705966 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.705998 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706022 4802 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706050 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706081 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706111 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706139 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706170 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706214 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706242 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706277 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706303 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.705826 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706651 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706779 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706813 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706853 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706942 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.706982 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.707230 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.707266 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.707279 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.707337 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.707376 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.707420 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.707580 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.705123 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.707430 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.708059 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.708128 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.708141 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.708231 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.708232 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.708446 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.708451 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.708482 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.708736 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709266 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709340 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709500 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709614 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709685 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709692 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709703 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709722 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709787 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709851 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709921 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 19:56:36 crc 
kubenswrapper[4802]: I1201 19:56:36.709986 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710055 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710110 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710117 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710232 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710311 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710362 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710407 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710448 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710540 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710598 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710673 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710748 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710830 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710887 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710945 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711010 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711065 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711135 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 19:56:36 
crc kubenswrapper[4802]: I1201 19:56:36.711234 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711299 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711362 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711434 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711500 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711556 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711621 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711699 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711766 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711829 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711890 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 19:56:36 crc 
kubenswrapper[4802]: I1201 19:56:36.711987 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712062 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712124 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712183 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712270 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712329 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712407 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712480 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712547 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712640 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712724 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " 
Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712798 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712861 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712912 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712971 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.713033 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.713098 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.713169 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.713305 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.713404 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.713461 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710589 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710691 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710727 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.710669 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711798 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.711901 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712250 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.712903 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.713032 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.713691 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.714383 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.709287 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.714539 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.715263 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.716769 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.717542 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.718127 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.718313 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.718322 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.718963 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719056 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719104 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719148 4802 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719188 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719257 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719298 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719334 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719373 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") 
pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719392 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719410 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719506 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719536 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719633 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 19:56:36 
crc kubenswrapper[4802]: I1201 19:56:36.719657 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719678 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719701 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.719727 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720028 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720057 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720083 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720104 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720123 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720141 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720162 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 19:56:36 crc 
kubenswrapper[4802]: I1201 19:56:36.720184 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720218 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720235 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720259 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720277 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720293 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720309 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720327 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720450 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720507 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720734 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.720992 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.721210 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.721520 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.721581 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.721834 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.721883 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.722273 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.722642 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.724217 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.724323 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.724843 4802 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.724976 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.725119 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:37.225077345 +0000 UTC m=+18.787637026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.725273 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.725381 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.725557 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.725638 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.725658 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.725695 4802 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.726165 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.726324 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.726355 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.726497 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.726565 4802 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.726631 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.726790 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.726801 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.727039 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:37.227010868 +0000 UTC m=+18.789570519 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.727273 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.727311 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.727257 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.727627 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.727670 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.727738 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.729171 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.729369 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.729690 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.730440 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.731592 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.731713 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.733112 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.733182 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.733288 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.734124 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.735131 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.735292 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736302 4802 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736340 4802 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736353 4802 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736368 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736391 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736404 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736416 4802 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736429 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736445 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736457 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736468 4802 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736480 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736490 4802 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736502 4802 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736514 4802 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736526 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736537 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736548 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc 
kubenswrapper[4802]: I1201 19:56:36.736561 4802 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736572 4802 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736580 4802 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736590 4802 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736600 4802 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736611 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736622 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736632 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736642 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736652 4802 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736854 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736865 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736875 4802 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736886 4802 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736897 4802 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" 
Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736908 4802 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736919 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736931 4802 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736942 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736958 4802 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736969 4802 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736982 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 
19:56:36.736992 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737002 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737025 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737035 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737095 4802 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737153 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737164 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737175 4802 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737186 4802 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737209 4802 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737221 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737232 4802 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737245 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737256 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737267 4802 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737278 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737288 4802 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737299 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737311 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737326 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737336 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737346 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") 
on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737357 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737367 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737378 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737403 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737412 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737423 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737434 4802 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 
crc kubenswrapper[4802]: I1201 19:56:36.737445 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737456 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737466 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737477 4802 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737487 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737497 4802 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737505 4802 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737514 4802 reconciler_common.go:293] "Volume 
detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737524 4802 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737533 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737542 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737551 4802 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737560 4802 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737570 4802 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737580 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737589 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737599 4802 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737610 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737619 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737628 4802 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737637 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737649 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath 
\"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737658 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737667 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737678 4802 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737688 4802 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737698 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737707 4802 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737716 4802 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737725 4802 reconciler_common.go:293] 
"Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737923 4802 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737932 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737942 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737952 4802 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737961 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737973 4802 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737983 4802 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737992 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738017 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738027 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738037 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738048 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738057 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738068 4802 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738077 4802 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738085 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738094 4802 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738102 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738112 4802 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738122 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738131 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" 
Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738140 4802 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736304 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736393 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736858 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.736918 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.737527 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738360 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738575 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.738868 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.740097 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.740350 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.741742 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.742389 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.743429 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.744148 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.744804 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.745859 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.746277 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.746292 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.746388 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.746979 4802 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.748654 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.749403 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.750400 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.750648 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.750731 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.750759 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.750774 4802 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.750849 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:37.250827449 +0000 UTC m=+18.813387090 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.750885 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.751183 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.751143 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.751575 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.751952 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.752688 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.754222 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.755032 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.755192 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.756768 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.756980 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.759902 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.760088 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.760219 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.760420 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.760591 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.760695 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.760880 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.760918 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.760972 4802 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.761042 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:37.261017979 +0000 UTC m=+18.823577640 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.761265 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.761530 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.761659 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.763022 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.763573 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.764340 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.766302 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.768016 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.768385 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.769101 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.770053 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.771777 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 19:56:36 crc 
kubenswrapper[4802]: I1201 19:56:36.772736 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.774374 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.776030 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.776761 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.776948 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.777377 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.777616 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.777886 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.778127 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.778385 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.778442 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.779018 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.779371 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.779578 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.779955 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.781140 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.781533 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.781768 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.781785 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.781951 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.781982 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.782697 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.784558 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.784919 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.785147 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.785373 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.785758 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.788478 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.788556 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.789459 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.790217 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.790873 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.791001 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.791028 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.791431 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.791894 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.792166 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.792524 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.793653 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.793665 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.799859 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.799937 4802 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.799990 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.800085 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.802364 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.802617 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.804965 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.806729 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.810022 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.810189 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.811726 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.818109 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.818990 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.820368 4802 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a" exitCode=255 Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.821016 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a"} Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.826006 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839125 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839176 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839242 4802 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839268 4802 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839279 4802 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839289 4802 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839300 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839309 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839319 4802 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839328 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839336 4802 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839345 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839358 4802 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839370 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839379 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839388 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839396 4802 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839411 4802 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839422 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839433 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839443 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839451 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839459 4802 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839468 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839477 4802 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839486 4802 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839495 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839505 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839515 4802 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839525 4802 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839535 4802 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839544 4802 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839557 4802 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839570 4802 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839580 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839593 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839603 4802 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839612 4802 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839620 4802 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839635 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839643 4802 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839655 4802 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839664 4802 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839677 4802 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839687 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839695 4802 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" 
DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839707 4802 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839715 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839723 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839733 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839750 4802 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839764 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839776 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839785 4802 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839794 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839803 4802 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839812 4802 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839821 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839830 4802 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839839 4802 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839848 4802 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839858 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839867 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839878 4802 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839888 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839896 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839905 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839923 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath 
\"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839931 4802 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839940 4802 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839947 4802 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839956 4802 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839966 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839976 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.839984 4802 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.840174 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.840771 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.851914 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.882084 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.895776 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981
ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.899402 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.899573 4802 scope.go:117] "RemoveContainer" containerID="2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a" Dec 01 19:56:36 crc kubenswrapper[4802]: E1201 19:56:36.911757 4802 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.913248 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.924596 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.946186 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.962077 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.979433 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.981638 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 19:56:36 crc kubenswrapper[4802]: I1201 19:56:36.992097 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 19:56:37 crc kubenswrapper[4802]: W1201 19:56:36.999449 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-1bf655fbf4ede82680d9029ebed230c6dd5afd499b939f91853f5169d8e47fb4 WatchSource:0}: Error finding container 1bf655fbf4ede82680d9029ebed230c6dd5afd499b939f91853f5169d8e47fb4: Status 404 returned error can't find the container with id 1bf655fbf4ede82680d9029ebed230c6dd5afd499b939f91853f5169d8e47fb4 Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.019792 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.022088 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 19:56:37 crc kubenswrapper[4802]: W1201 19:56:37.042093 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-fd56c431c7f9fd276d43e176610457e231d9a9f0c5136462bbaf4b7e2b374ef0 WatchSource:0}: Error finding container fd56c431c7f9fd276d43e176610457e231d9a9f0c5136462bbaf4b7e2b374ef0: Status 404 returned error can't find the container with id fd56c431c7f9fd276d43e176610457e231d9a9f0c5136462bbaf4b7e2b374ef0 Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.051573 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.243228 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.243314 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:56:38.243286415 +0000 UTC m=+19.805846056 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.243350 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.243387 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.243560 4802 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.243629 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:38.243610016 +0000 UTC m=+19.806169657 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.243668 4802 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.243691 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:38.243685248 +0000 UTC m=+19.806244889 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.344820 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.344904 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.345094 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.345145 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.345148 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.345221 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.345163 4802 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.345239 4802 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.345294 4802 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:38.345274376 +0000 UTC m=+19.907834007 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.345328 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:38.345304337 +0000 UTC m=+19.907863978 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.719545 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:37 crc kubenswrapper[4802]: E1201 19:56:37.719884 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.823887 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4947603563b4fa3afed935e09c397270d5ac831c227722bda6756df42d15ca84"} Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.826055 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385"} Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.826145 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1bf655fbf4ede82680d9029ebed230c6dd5afd499b939f91853f5169d8e47fb4"} Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.829290 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.831164 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2"} Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.831511 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.832504 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214"} Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.832554 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a"} Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.832575 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fd56c431c7f9fd276d43e176610457e231d9a9f0c5136462bbaf4b7e2b374ef0"} Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.856128 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 
1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee
1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:37Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.889886 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:37Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.930828 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:37Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.969217 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981
ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:37Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:37 crc kubenswrapper[4802]: I1201 19:56:37.999441 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:37Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.038838 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.074324 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.116472 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.134465 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.151892 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.164253 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.179742 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.201624 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.208415 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-tw4xd"] Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.208814 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-htfwc"] Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.209033 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.209284 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9wvdw"] Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.209440 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.209749 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8zl28"] Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.209893 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9wvdw" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.210028 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.211961 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.212697 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.212745 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.212703 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.213004 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.216493 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.216539 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.216543 4802 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.216622 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.216729 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.216795 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.217003 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.217036 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.217134 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.217250 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.219457 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.241237 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.252494 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.252599 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.252632 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.252791 4802 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.252863 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:40.252843357 +0000 UTC m=+21.815402998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.252940 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:56:40.25293025 +0000 UTC m=+21.815489891 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.252976 4802 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.253001 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:40.252994972 +0000 UTC m=+21.815554603 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.261380 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.276776 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981
ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.297087 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.312338 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.325273 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.338323 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.350832 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353055 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-system-cni-dir\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353109 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-cni-dir\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353213 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-var-lib-cni-multus\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353240 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zcvx\" (UniqueName: \"kubernetes.io/projected/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-kube-api-access-8zcvx\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353264 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g58lj\" 
(UniqueName: \"kubernetes.io/projected/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-kube-api-access-g58lj\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353285 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hch6z\" (UniqueName: \"kubernetes.io/projected/23e1ef99-f507-42ea-a076-4fc1681c7e8c-kube-api-access-hch6z\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353303 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3c338aa9-4647-4436-aaf2-d7b1d85b9219-hosts-file\") pod \"node-resolver-9wvdw\" (UID: \"3c338aa9-4647-4436-aaf2-d7b1d85b9219\") " pod="openshift-dns/node-resolver-9wvdw" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353322 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-cnibin\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353340 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-run-k8s-cni-cncf-io\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353358 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-var-lib-cni-bin\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353395 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-os-release\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353410 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-etc-kubernetes\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353426 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-system-cni-dir\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353440 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-conf-dir\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353456 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-run-multus-certs\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353476 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-hostroot\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353498 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353517 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353542 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-cni-binary-copy\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc 
kubenswrapper[4802]: I1201 19:56:38.353557 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7p9c\" (UniqueName: \"kubernetes.io/projected/3c338aa9-4647-4436-aaf2-d7b1d85b9219-kube-api-access-w7p9c\") pod \"node-resolver-9wvdw\" (UID: \"3c338aa9-4647-4436-aaf2-d7b1d85b9219\") " pod="openshift-dns/node-resolver-9wvdw" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353573 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23e1ef99-f507-42ea-a076-4fc1681c7e8c-mcd-auth-proxy-config\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353589 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/23e1ef99-f507-42ea-a076-4fc1681c7e8c-rootfs\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353603 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-cnibin\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353616 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-os-release\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 
19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353635 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-tuning-conf-dir\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353655 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23e1ef99-f507-42ea-a076-4fc1681c7e8c-proxy-tls\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353721 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-socket-dir-parent\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353801 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-run-netns\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353831 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353853 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-daemon-config\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353872 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-cni-binary-copy\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.353875 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.353889 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-var-lib-kubelet\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.353895 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.353923 4802 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.354056 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.354073 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.354082 4802 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.354114 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:40.354095693 +0000 UTC m=+21.916655324 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.354233 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:40.354214097 +0000 UTC m=+21.916773738 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.363852 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.382020 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.395091 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.408652 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.420057 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.434600 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.454897 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-system-cni-dir\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.454952 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-conf-dir\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.454980 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-run-multus-certs\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455006 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-hostroot\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455044 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " 
pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455077 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-cni-binary-copy\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455129 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7p9c\" (UniqueName: \"kubernetes.io/projected/3c338aa9-4647-4436-aaf2-d7b1d85b9219-kube-api-access-w7p9c\") pod \"node-resolver-9wvdw\" (UID: \"3c338aa9-4647-4436-aaf2-d7b1d85b9219\") " pod="openshift-dns/node-resolver-9wvdw" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455159 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23e1ef99-f507-42ea-a076-4fc1681c7e8c-mcd-auth-proxy-config\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455183 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/23e1ef99-f507-42ea-a076-4fc1681c7e8c-rootfs\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455363 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-cnibin\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " 
pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455121 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-conf-dir\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455480 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-os-release\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455250 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-hostroot\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455206 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-run-multus-certs\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455464 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-cnibin\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455311 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/23e1ef99-f507-42ea-a076-4fc1681c7e8c-rootfs\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.455954 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-system-cni-dir\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456130 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23e1ef99-f507-42ea-a076-4fc1681c7e8c-mcd-auth-proxy-config\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456157 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-os-release\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456191 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-tuning-conf-dir\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456247 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/23e1ef99-f507-42ea-a076-4fc1681c7e8c-proxy-tls\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456270 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-socket-dir-parent\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456291 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456310 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-run-netns\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456361 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-run-netns\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456380 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-daemon-config\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456411 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-cni-binary-copy\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456428 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-var-lib-kubelet\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456462 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-system-cni-dir\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456480 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-cni-dir\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456478 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-cni-binary-copy\") pod \"multus-additional-cni-plugins-htfwc\" (UID: 
\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456498 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-var-lib-cni-multus\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456541 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-var-lib-cni-multus\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456559 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-var-lib-kubelet\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456594 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-socket-dir-parent\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456616 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-tuning-conf-dir\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " 
pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456583 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zcvx\" (UniqueName: \"kubernetes.io/projected/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-kube-api-access-8zcvx\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456746 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-system-cni-dir\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456785 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g58lj\" (UniqueName: \"kubernetes.io/projected/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-kube-api-access-g58lj\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.456847 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-cni-dir\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457014 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hch6z\" (UniqueName: \"kubernetes.io/projected/23e1ef99-f507-42ea-a076-4fc1681c7e8c-kube-api-access-hch6z\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" 
Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457049 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3c338aa9-4647-4436-aaf2-d7b1d85b9219-hosts-file\") pod \"node-resolver-9wvdw\" (UID: \"3c338aa9-4647-4436-aaf2-d7b1d85b9219\") " pod="openshift-dns/node-resolver-9wvdw" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457077 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-cnibin\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457096 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-run-k8s-cni-cncf-io\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457116 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-var-lib-cni-bin\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457133 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-os-release\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457154 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-etc-kubernetes\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457171 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-run-k8s-cni-cncf-io\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457183 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3c338aa9-4647-4436-aaf2-d7b1d85b9219-hosts-file\") pod \"node-resolver-9wvdw\" (UID: \"3c338aa9-4647-4436-aaf2-d7b1d85b9219\") " pod="openshift-dns/node-resolver-9wvdw" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457132 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-multus-daemon-config\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457157 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-cnibin\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457242 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-os-release\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457254 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-etc-kubernetes\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457288 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-host-var-lib-cni-bin\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.457325 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-cni-binary-copy\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.461215 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23e1ef99-f507-42ea-a076-4fc1681c7e8c-proxy-tls\") pod \"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.474548 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hch6z\" (UniqueName: \"kubernetes.io/projected/23e1ef99-f507-42ea-a076-4fc1681c7e8c-kube-api-access-hch6z\") pod 
\"machine-config-daemon-tw4xd\" (UID: \"23e1ef99-f507-42ea-a076-4fc1681c7e8c\") " pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.477391 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7p9c\" (UniqueName: \"kubernetes.io/projected/3c338aa9-4647-4436-aaf2-d7b1d85b9219-kube-api-access-w7p9c\") pod \"node-resolver-9wvdw\" (UID: \"3c338aa9-4647-4436-aaf2-d7b1d85b9219\") " pod="openshift-dns/node-resolver-9wvdw" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.479259 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g58lj\" (UniqueName: \"kubernetes.io/projected/bd82ca15-4489-4c15-aaf0-afb6b6787dc6-kube-api-access-g58lj\") pod \"multus-8zl28\" (UID: \"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\") " pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.479497 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zcvx\" (UniqueName: \"kubernetes.io/projected/3fb49bb8-3d0a-4ff5-80bf-60c34f310345-kube-api-access-8zcvx\") pod \"multus-additional-cni-plugins-htfwc\" (UID: \"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\") " pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.522097 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.528166 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-htfwc" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.534781 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9wvdw" Dec 01 19:56:38 crc kubenswrapper[4802]: W1201 19:56:38.537453 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e1ef99_f507_42ea_a076_4fc1681c7e8c.slice/crio-dc29b028a9214b76f62f9ae1c1844a03f960fe1740ec6b419acf489a53f9a04f WatchSource:0}: Error finding container dc29b028a9214b76f62f9ae1c1844a03f960fe1740ec6b419acf489a53f9a04f: Status 404 returned error can't find the container with id dc29b028a9214b76f62f9ae1c1844a03f960fe1740ec6b419acf489a53f9a04f Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.539676 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8zl28" Dec 01 19:56:38 crc kubenswrapper[4802]: W1201 19:56:38.545004 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fb49bb8_3d0a_4ff5_80bf_60c34f310345.slice/crio-4de7157fad02b36b3ab9836d12934ba8fa1aaba88e8fec3f831ed5736953f55f WatchSource:0}: Error finding container 4de7157fad02b36b3ab9836d12934ba8fa1aaba88e8fec3f831ed5736953f55f: Status 404 returned error can't find the container with id 4de7157fad02b36b3ab9836d12934ba8fa1aaba88e8fec3f831ed5736953f55f Dec 01 19:56:38 crc kubenswrapper[4802]: W1201 19:56:38.554554 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c338aa9_4647_4436_aaf2_d7b1d85b9219.slice/crio-8f0b532fca2fdab7bc4a5ffa0678b58f9a0421736ad61838fb4a951fab4381f4 WatchSource:0}: Error finding container 8f0b532fca2fdab7bc4a5ffa0678b58f9a0421736ad61838fb4a951fab4381f4: Status 404 returned error can't find the container with id 8f0b532fca2fdab7bc4a5ffa0678b58f9a0421736ad61838fb4a951fab4381f4 Dec 01 19:56:38 crc kubenswrapper[4802]: W1201 19:56:38.562241 4802 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd82ca15_4489_4c15_aaf0_afb6b6787dc6.slice/crio-22786326ccb601e98c7c6a31e1e98efbc8927d241c829ccbaaf4fd10a82e0e75 WatchSource:0}: Error finding container 22786326ccb601e98c7c6a31e1e98efbc8927d241c829ccbaaf4fd10a82e0e75: Status 404 returned error can't find the container with id 22786326ccb601e98c7c6a31e1e98efbc8927d241c829ccbaaf4fd10a82e0e75 Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.578635 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7nr2"] Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.579800 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.583764 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.587476 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.587983 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.588350 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.588636 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.589023 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.589433 4802 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.599546 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981
ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.614220 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.626375 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.651215 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.676991 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.691410 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.706321 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.720145 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.720611 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.720938 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:38 crc kubenswrapper[4802]: E1201 19:56:38.720986 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.722046 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.728820 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.729452 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.730696 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.731277 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.731755 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 
19:56:38.733296 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.733885 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.734817 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.735219 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.735579 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.736811 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.737266 4802 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.738321 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.738840 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.739315 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.740898 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.741324 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.742477 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.743025 4802 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 19:56:38 
crc kubenswrapper[4802]: I1201 19:56:38.743135 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.745703 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.746248 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.747526 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.748578 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.749799 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.750524 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.751591 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 19:56:38 
crc kubenswrapper[4802]: I1201 19:56:38.754075 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.760142 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-kubelet\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.760176 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-bin\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762796 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-ovn\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762837 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9xgd\" (UniqueName: \"kubernetes.io/projected/933fb25a-a01a-464e-838a-df1d07bca99e-kube-api-access-d9xgd\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762854 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-systemd-units\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762870 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-netns\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762890 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-var-lib-openvswitch\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762905 4802 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-openvswitch\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762919 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-netd\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762946 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-slash\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762966 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-node-log\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.762980 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 
19:56:38.762994 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-log-socket\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.763008 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-script-lib\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.763026 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-systemd\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.763098 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-etc-openvswitch\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.763123 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.763150 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/933fb25a-a01a-464e-838a-df1d07bca99e-ovn-node-metrics-cert\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.763172 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-config\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.763210 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-env-overrides\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.768603 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.778963 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.828283 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.863916 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-env-overrides\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.863957 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-config\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.863984 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-kubelet\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.863998 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-bin\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864020 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-ovn\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864042 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9xgd\" (UniqueName: \"kubernetes.io/projected/933fb25a-a01a-464e-838a-df1d07bca99e-kube-api-access-d9xgd\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864058 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-systemd-units\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864073 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-netns\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864087 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-var-lib-openvswitch\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864101 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-slash\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 
crc kubenswrapper[4802]: I1201 19:56:38.864115 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-openvswitch\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864130 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-netd\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864151 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864165 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-node-log\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864178 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-log-socket\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864192 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-script-lib\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864258 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864278 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-systemd\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864291 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-etc-openvswitch\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.864306 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/933fb25a-a01a-464e-838a-df1d07bca99e-ovn-node-metrics-cert\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.865772 4802 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-8zl28" event={"ID":"bd82ca15-4489-4c15-aaf0-afb6b6787dc6","Type":"ContainerStarted","Data":"f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3"} Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.865847 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8zl28" event={"ID":"bd82ca15-4489-4c15-aaf0-afb6b6787dc6","Type":"ContainerStarted","Data":"22786326ccb601e98c7c6a31e1e98efbc8927d241c829ccbaaf4fd10a82e0e75"} Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.867786 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-slash\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868455 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-env-overrides\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868611 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79
e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506c
e0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868857 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-log-socket\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868873 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-config\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868885 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868907 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-openvswitch\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868910 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-kubelet\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868928 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-bin\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868950 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-netd\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868955 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-ovn\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.868975 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.869002 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-node-log\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.869026 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-netns\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.869048 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-systemd-units\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.869074 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-systemd\") pod 
\"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.869096 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-var-lib-openvswitch\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.869120 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-etc-openvswitch\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.869599 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-script-lib\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.874464 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" event={"ID":"3fb49bb8-3d0a-4ff5-80bf-60c34f310345","Type":"ContainerStarted","Data":"4de7157fad02b36b3ab9836d12934ba8fa1aaba88e8fec3f831ed5736953f55f"} Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.882707 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/933fb25a-a01a-464e-838a-df1d07bca99e-ovn-node-metrics-cert\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 
19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.892411 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c"} Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.892469 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809"} Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.892482 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"dc29b028a9214b76f62f9ae1c1844a03f960fe1740ec6b419acf489a53f9a04f"} Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.896520 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.897183 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9xgd\" (UniqueName: \"kubernetes.io/projected/933fb25a-a01a-464e-838a-df1d07bca99e-kube-api-access-d9xgd\") pod \"ovnkube-node-t7nr2\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.901418 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9wvdw" event={"ID":"3c338aa9-4647-4436-aaf2-d7b1d85b9219","Type":"ContainerStarted","Data":"8f0b532fca2fdab7bc4a5ffa0678b58f9a0421736ad61838fb4a951fab4381f4"} Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.923337 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.931551 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.944736 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.958963 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:38 crc kubenswrapper[4802]: I1201 19:56:38.978164 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.000252 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.021874 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.039509 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.051091 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.065042 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.083940 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.098996 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.112759 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.126622 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.140774 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.158118 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.177226 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.195488 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.214262 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.226946 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.247746 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.274402 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.287554 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981
ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.298537 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.314489 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.719236 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:39 crc kubenswrapper[4802]: E1201 19:56:39.719853 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.905367 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a" exitCode=0 Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.905404 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a"} Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.905462 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"6638e6250b02a048e64f9109c1084cd29aa6dec4c71066d84b55be130ca8d575"} Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.906948 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3"} Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.908549 4802 generic.go:334] "Generic (PLEG): container finished" podID="3fb49bb8-3d0a-4ff5-80bf-60c34f310345" 
containerID="ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8" exitCode=0 Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.908600 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" event={"ID":"3fb49bb8-3d0a-4ff5-80bf-60c34f310345","Type":"ContainerDied","Data":"ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8"} Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.911036 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9wvdw" event={"ID":"3c338aa9-4647-4436-aaf2-d7b1d85b9219","Type":"ContainerStarted","Data":"bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e"} Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.919170 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.940628 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.953661 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.966829 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.978834 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:39 crc kubenswrapper[4802]: I1201 19:56:39.993813 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T19:56:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.016960 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.030423 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981
ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.041892 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.071416 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 
configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.101485 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.111979 4802 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.113841 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.113875 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.113885 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.113974 4802 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.114529 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.148638 4802 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.148847 4802 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.149834 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.149873 4802 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.149882 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.149898 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.149908 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.154813 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.174356 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.177791 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.178412 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.178449 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.178463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.178484 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.178499 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.188954 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.190606 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.195629 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.195806 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.195889 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.195977 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.196044 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.200910 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.207088 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.210484 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.210553 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.210573 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.210582 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.210596 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.210605 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.224491 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.227735 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.232138 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.232172 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.232180 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.232215 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.232224 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.238684 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981
ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.244477 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.244590 4802 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.246437 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.246479 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.246505 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.246521 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.246532 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.256382 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.270361 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.284063 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.284210 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.284267 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.284362 4802 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.284412 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:44.284397941 +0000 UTC m=+25.846957583 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.284730 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:56:44.284720272 +0000 UTC m=+25.847279913 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.284792 4802 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.284816 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:44.284810345 +0000 UTC m=+25.847369986 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.290609 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.302547 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.314340 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.331948 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.346257 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.352872 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.352912 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.352922 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.352937 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.352947 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.385648 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.385704 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.385822 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.385840 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.385864 4802 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.385925 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:44.385906817 +0000 UTC m=+25.948466458 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.385960 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.386005 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.386024 4802 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.386125 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2025-12-01 19:56:44.386092473 +0000 UTC m=+25.948652114 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.455215 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.455255 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.455263 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.455280 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.455289 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.557685 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.557735 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.557748 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.557765 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.557777 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.659762 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.659800 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.659813 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.659829 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.659841 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.719933 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.719933 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.720134 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:40 crc kubenswrapper[4802]: E1201 19:56:40.720189 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.762986 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.763025 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.763036 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.763054 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.763064 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.865457 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.865497 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.865506 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.865520 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.865532 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.917377 4802 generic.go:334] "Generic (PLEG): container finished" podID="3fb49bb8-3d0a-4ff5-80bf-60c34f310345" containerID="48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b" exitCode=0 Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.917459 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" event={"ID":"3fb49bb8-3d0a-4ff5-80bf-60c34f310345","Type":"ContainerDied","Data":"48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.922842 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.922907 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.922920 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.922931 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.922951 4802 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.922961 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.932817 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.948385 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.960453 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.969544 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.969615 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.969630 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.969656 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.969679 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:40Z","lastTransitionTime":"2025-12-01T19:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.973304 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:40 crc kubenswrapper[4802]: I1201 19:56:40.992580 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:40Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.004572 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.016055 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.031730 4802 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.059318 4802 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.075304 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.075380 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.075400 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.075436 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.075459 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:41Z","lastTransitionTime":"2025-12-01T19:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.076038 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.093885 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.107278 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.125283 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.178553 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.178615 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.178628 4802 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.178650 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.178664 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:41Z","lastTransitionTime":"2025-12-01T19:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.281389 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.281431 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.281446 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.281463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.281474 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:41Z","lastTransitionTime":"2025-12-01T19:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.384899 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.384964 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.384980 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.385005 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.385022 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:41Z","lastTransitionTime":"2025-12-01T19:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.487487 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.487549 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.487566 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.487597 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.487616 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:41Z","lastTransitionTime":"2025-12-01T19:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.591120 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.591184 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.591239 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.591265 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.591283 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:41Z","lastTransitionTime":"2025-12-01T19:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.693862 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.693917 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.693930 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.693946 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.693962 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:41Z","lastTransitionTime":"2025-12-01T19:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.719324 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:41 crc kubenswrapper[4802]: E1201 19:56:41.719596 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.797172 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.797258 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.797274 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.797297 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.797318 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:41Z","lastTransitionTime":"2025-12-01T19:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.899593 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.899639 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.899651 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.899671 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.899683 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:41Z","lastTransitionTime":"2025-12-01T19:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.927299 4802 generic.go:334] "Generic (PLEG): container finished" podID="3fb49bb8-3d0a-4ff5-80bf-60c34f310345" containerID="d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720" exitCode=0 Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.927339 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" event={"ID":"3fb49bb8-3d0a-4ff5-80bf-60c34f310345","Type":"ContainerDied","Data":"d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720"} Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.941031 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981
ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.956132 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:41 crc kubenswrapper[4802]: I1201 19:56:41.973259 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.000483 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:41Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.002475 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.002510 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.002523 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.002543 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.002555 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.018818 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.032497 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.049289 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 
19:56:42.061916 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hos
tIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.073250 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.086244 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.102418 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.105437 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.105482 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.105504 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.105525 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.105536 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.116502 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.133422 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.207670 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.207711 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.207727 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.207747 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.207761 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.310269 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.310349 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.310366 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.310390 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.310404 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.344398 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.351034 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.368884 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.389578 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.406361 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.413629 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.413729 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.413781 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc 
kubenswrapper[4802]: I1201 19:56:42.413807 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.413826 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.423831 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3d
c5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.440169 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.454922 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.471411 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.484350 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.499803 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.516747 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.516796 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.516806 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.516822 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.516837 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.519618 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981a
d4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.537230 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.557636 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.587192 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.610278 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 
19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.619025 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.619076 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.619089 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.619107 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.619117 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.630249 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z 
is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.662509 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.686625 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.705468 4802 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 
19:56:42.719300 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.719300 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:42 crc kubenswrapper[4802]: E1201 19:56:42.719504 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:42 crc kubenswrapper[4802]: E1201 19:56:42.719787 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.722183 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.722243 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.722253 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.722266 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.722278 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.728896 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.749833 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.762741 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.777583 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.791457 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.805429 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.818033 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.824560 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.824744 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.824829 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.824919 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.824997 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.838952 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.928864 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.929167 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.929300 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.929411 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.929521 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:42Z","lastTransitionTime":"2025-12-01T19:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.937298 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.940732 4802 generic.go:334] "Generic (PLEG): container finished" podID="3fb49bb8-3d0a-4ff5-80bf-60c34f310345" containerID="a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202" exitCode=0 Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.940836 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" event={"ID":"3fb49bb8-3d0a-4ff5-80bf-60c34f310345","Type":"ContainerDied","Data":"a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202"} Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.969427 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:42 crc kubenswrapper[4802]: I1201 19:56:42.993783 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.019024 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.032964 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.033008 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.033020 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.033039 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.033052 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.037049 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:
56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.057369 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.074501 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.090038 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.101183 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.106079 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gp8pz"] Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.106669 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.108809 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.109421 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.109833 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.110042 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.113164 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sklb8\" (UniqueName: \"kubernetes.io/projected/22a5a396-7caa-46ff-8456-3f6eb84db887-kube-api-access-sklb8\") pod \"node-ca-gp8pz\" (UID: \"22a5a396-7caa-46ff-8456-3f6eb84db887\") " pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.113242 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22a5a396-7caa-46ff-8456-3f6eb84db887-host\") pod \"node-ca-gp8pz\" (UID: \"22a5a396-7caa-46ff-8456-3f6eb84db887\") " pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.113284 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22a5a396-7caa-46ff-8456-3f6eb84db887-serviceca\") pod \"node-ca-gp8pz\" (UID: \"22a5a396-7caa-46ff-8456-3f6eb84db887\") " pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc 
kubenswrapper[4802]: I1201 19:56:43.117666 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.131281 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.135262 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.135294 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.135306 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.135325 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.135341 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.142995 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.154186 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.168302 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.179455 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649a
cd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.190297 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.204615 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.214299 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sklb8\" (UniqueName: \"kubernetes.io/projected/22a5a396-7caa-46ff-8456-3f6eb84db887-kube-api-access-sklb8\") pod \"node-ca-gp8pz\" (UID: \"22a5a396-7caa-46ff-8456-3f6eb84db887\") " pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.214344 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22a5a396-7caa-46ff-8456-3f6eb84db887-host\") pod \"node-ca-gp8pz\" (UID: \"22a5a396-7caa-46ff-8456-3f6eb84db887\") " pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.214359 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22a5a396-7caa-46ff-8456-3f6eb84db887-serviceca\") pod \"node-ca-gp8pz\" (UID: \"22a5a396-7caa-46ff-8456-3f6eb84db887\") " 
pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.215096 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22a5a396-7caa-46ff-8456-3f6eb84db887-host\") pod \"node-ca-gp8pz\" (UID: \"22a5a396-7caa-46ff-8456-3f6eb84db887\") " pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.215254 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22a5a396-7caa-46ff-8456-3f6eb84db887-serviceca\") pod \"node-ca-gp8pz\" (UID: \"22a5a396-7caa-46ff-8456-3f6eb84db887\") " pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.218020 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.230340 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.245060 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sklb8\" (UniqueName: \"kubernetes.io/projected/22a5a396-7caa-46ff-8456-3f6eb84db887-kube-api-access-sklb8\") pod \"node-ca-gp8pz\" (UID: \"22a5a396-7caa-46ff-8456-3f6eb84db887\") " pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.245887 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.246038 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.246128 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.246209 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.246274 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.246342 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.260628 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z 
is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.278100 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.289941 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.307165 4802 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.323862 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 
19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.345409 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.349159 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.349419 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.349617 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc 
kubenswrapper[4802]: I1201 19:56:43.349757 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.349872 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.360560 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.403742 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.423331 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gp8pz" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.452374 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.452481 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.452538 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.452594 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.452646 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.574228 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.574467 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.574529 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.574589 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.574646 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.676905 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.677095 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.677181 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.677256 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.677310 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.718935 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:43 crc kubenswrapper[4802]: E1201 19:56:43.719480 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:43 crc kubenswrapper[4802]: W1201 19:56:43.770160 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22a5a396_7caa_46ff_8456_3f6eb84db887.slice/crio-d20d09d8d8542a109bba1f04854c4cbd179b4f54920d3ccff5135bc26bb83b9a WatchSource:0}: Error finding container d20d09d8d8542a109bba1f04854c4cbd179b4f54920d3ccff5135bc26bb83b9a: Status 404 returned error can't find the container with id d20d09d8d8542a109bba1f04854c4cbd179b4f54920d3ccff5135bc26bb83b9a Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.780361 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.780398 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.780408 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.780422 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.780432 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.882976 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.883368 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.883387 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.883414 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.883431 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.944965 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gp8pz" event={"ID":"22a5a396-7caa-46ff-8456-3f6eb84db887","Type":"ContainerStarted","Data":"d20d09d8d8542a109bba1f04854c4cbd179b4f54920d3ccff5135bc26bb83b9a"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.948646 4802 generic.go:334] "Generic (PLEG): container finished" podID="3fb49bb8-3d0a-4ff5-80bf-60c34f310345" containerID="4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca" exitCode=0 Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.948698 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" event={"ID":"3fb49bb8-3d0a-4ff5-80bf-60c34f310345","Type":"ContainerDied","Data":"4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.974021 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.987930 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.987965 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.987974 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.988007 4802 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.988018 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:43Z","lastTransitionTime":"2025-12-01T19:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:43 crc kubenswrapper[4802]: I1201 19:56:43.993059 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:43Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.012786 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.023987 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.035756 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.047345 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.058134 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.076897 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.090751 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.090806 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.090816 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.090836 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.090849 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:44Z","lastTransitionTime":"2025-12-01T19:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.094505 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.107707 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.122931 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.137051 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.158942 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.172376 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.192850 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.192892 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.192905 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.192921 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.192934 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:44Z","lastTransitionTime":"2025-12-01T19:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.295833 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.295891 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.295902 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.295916 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.295925 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:44Z","lastTransitionTime":"2025-12-01T19:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.323549 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.323651 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.323700 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.323808 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:56:52.323767333 +0000 UTC m=+33.886327014 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.323832 4802 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.323880 4802 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.323906 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:52.323887337 +0000 UTC m=+33.886446978 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.323996 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 19:56:52.323967719 +0000 UTC m=+33.886527390 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.399383 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.399420 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.399429 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.399448 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.399457 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:44Z","lastTransitionTime":"2025-12-01T19:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.424314 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.424390 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.424557 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.424615 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.424642 4802 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.424569 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:44 crc 
kubenswrapper[4802]: E1201 19:56:44.424713 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.424729 4802 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.424760 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:52.42471449 +0000 UTC m=+33.987274281 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.424811 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 19:56:52.424789362 +0000 UTC m=+33.987349173 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.501659 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.501755 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.501776 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.501803 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.501822 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:44Z","lastTransitionTime":"2025-12-01T19:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.604335 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.604388 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.604404 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.604427 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.604444 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:44Z","lastTransitionTime":"2025-12-01T19:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.707363 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.707408 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.707417 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.707431 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.707440 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:44Z","lastTransitionTime":"2025-12-01T19:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.719321 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.719338 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.719439 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:44 crc kubenswrapper[4802]: E1201 19:56:44.719541 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.809650 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.809696 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.809709 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.809727 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.809738 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:44Z","lastTransitionTime":"2025-12-01T19:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.912117 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.912170 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.912186 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.912243 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.912260 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:44Z","lastTransitionTime":"2025-12-01T19:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.956216 4802 generic.go:334] "Generic (PLEG): container finished" podID="3fb49bb8-3d0a-4ff5-80bf-60c34f310345" containerID="5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c" exitCode=0 Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.956286 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" event={"ID":"3fb49bb8-3d0a-4ff5-80bf-60c34f310345","Type":"ContainerDied","Data":"5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.957961 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gp8pz" event={"ID":"22a5a396-7caa-46ff-8456-3f6eb84db887","Type":"ContainerStarted","Data":"c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7"} Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.972866 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:44 crc kubenswrapper[4802]: I1201 19:56:44.993018 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:44Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.016450 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.021306 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc 
kubenswrapper[4802]: I1201 19:56:45.021343 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.021351 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.021367 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.021376 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.037607 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.058673 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.074512 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.088107 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 
19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.098098 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.107340 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.116259 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.123869 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.123894 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.123904 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.123916 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.123925 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.125322 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.134403 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.145708 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.158782 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.172383 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.184189 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.195097 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.207010 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.217579 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.226343 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.226427 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc 
kubenswrapper[4802]: I1201 19:56:45.226448 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.226471 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.226487 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.230509 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.242881 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.252269 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.263628 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e6
3278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.275645 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.287372 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.301560 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.313016 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.329102 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.329398 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.329732 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.329755 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.329784 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.329806 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.433068 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.433120 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.433135 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.433154 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.433165 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.535836 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.535875 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.535886 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.535903 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.535912 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.638348 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.638384 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.638397 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.638415 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.638427 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.719764 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:45 crc kubenswrapper[4802]: E1201 19:56:45.719888 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.740897 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.741124 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.741235 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.741338 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.741416 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.844073 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.844134 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.844155 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.844185 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.844240 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.946600 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.946640 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.946649 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.946661 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.946670 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:45Z","lastTransitionTime":"2025-12-01T19:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.965986 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" event={"ID":"3fb49bb8-3d0a-4ff5-80bf-60c34f310345","Type":"ContainerStarted","Data":"c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.971242 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf"} Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.971593 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.971657 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.971812 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:45 crc kubenswrapper[4802]: I1201 19:56:45.984912 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.002187 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:45Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.007481 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 
19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.007773 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.033651 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.048712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.048648 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.048754 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.048906 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.048938 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.048956 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.063733 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.076814 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.090431 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.102176 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.113822 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.125798 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.136319 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e6
3278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.150048 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.151907 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.151947 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.151958 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.151981 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.151993 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.163671 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.175954 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.189506 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.209006 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.223571 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.238147 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.253802 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.255965 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc 
kubenswrapper[4802]: I1201 19:56:46.256003 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.256015 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.256033 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.256046 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.278849 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] 
issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 
19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.294399 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.308485 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.325377 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.338666 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.352464 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.359270 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.359318 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.359335 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 
19:56:46.359359 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.359375 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.372733 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.388833 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.408282 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:46Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.462215 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.462281 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.462294 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.462311 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.462345 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.565005 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.565044 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.565056 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.565072 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.565085 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.667039 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.667099 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.667117 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.667141 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.667161 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.719906 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.719956 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:46 crc kubenswrapper[4802]: E1201 19:56:46.720057 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:46 crc kubenswrapper[4802]: E1201 19:56:46.720218 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.769991 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.770050 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.770067 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.770092 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.770110 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.872836 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.872868 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.872879 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.872898 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.872910 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.974501 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.974534 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.974542 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.974558 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:46 crc kubenswrapper[4802]: I1201 19:56:46.974568 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:46Z","lastTransitionTime":"2025-12-01T19:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.077376 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.077436 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.077453 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.077478 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.077498 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:47Z","lastTransitionTime":"2025-12-01T19:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.180099 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.180139 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.180148 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.180164 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.180176 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:47Z","lastTransitionTime":"2025-12-01T19:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.282948 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.282997 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.283011 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.283030 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.283046 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:47Z","lastTransitionTime":"2025-12-01T19:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.385984 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.386017 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.386027 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.386042 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.386053 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:47Z","lastTransitionTime":"2025-12-01T19:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.488357 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.488406 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.488424 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.488445 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.488459 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:47Z","lastTransitionTime":"2025-12-01T19:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.590756 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.590823 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.590843 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.590869 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.590889 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:47Z","lastTransitionTime":"2025-12-01T19:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.694575 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.694706 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.694737 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.694777 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.694804 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:47Z","lastTransitionTime":"2025-12-01T19:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.720034 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:47 crc kubenswrapper[4802]: E1201 19:56:47.720321 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.798536 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.798602 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.798623 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.798648 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.798665 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:47Z","lastTransitionTime":"2025-12-01T19:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.902844 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.902914 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.902941 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.902974 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.902994 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:47Z","lastTransitionTime":"2025-12-01T19:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.980907 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/0.log" Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.984771 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf" exitCode=1 Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.984829 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf"} Dec 01 19:56:47 crc kubenswrapper[4802]: I1201 19:56:47.986164 4802 scope.go:117] "RemoveContainer" containerID="7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.000366 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:47Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.012959 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.013002 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.013017 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:48 crc 
kubenswrapper[4802]: I1201 19:56:48.013034 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.013044 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.020479 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.036729 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.050735 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.062319 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.073493 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.084232 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.095743 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.104407 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e6
3278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.115018 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.115049 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.115059 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.115073 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.115083 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.120805 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network
-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.134988 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.151990 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.168515 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.187773 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 
19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.218592 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.218628 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.218639 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.218654 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.218665 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.321383 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.321452 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.321470 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.321499 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.321516 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.423658 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.423701 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.423712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.423730 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.423739 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.527186 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.527251 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.527264 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.527286 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.527299 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.629462 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.629506 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.629517 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.629534 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.629543 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.719852 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.719893 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:48 crc kubenswrapper[4802]: E1201 19:56:48.719975 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:48 crc kubenswrapper[4802]: E1201 19:56:48.720017 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.729026 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.731091 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.731122 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.731134 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.731148 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc 
kubenswrapper[4802]: I1201 19:56:48.731159 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.738506 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.749383 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.759999 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.770148 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.780429 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.790082 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.813432 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 
19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.825973 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.833342 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.833377 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.833385 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.833399 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.833408 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.840477 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579
493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.853162 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 
configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.864803 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.877946 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.900222 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.934960 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.935012 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.935029 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.935057 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.935079 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:48Z","lastTransitionTime":"2025-12-01T19:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.990480 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/0.log" Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.994750 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507"} Dec 01 19:56:48 crc kubenswrapper[4802]: I1201 19:56:48.995511 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.006674 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.016909 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.030261 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.038596 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.038626 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.038636 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.038650 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.038693 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.044999 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.061469 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.076770 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.089766 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.104718 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.125076 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] 
Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.138737 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.140732 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.140764 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.140775 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.140793 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.140807 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.154430 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whe
reabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.170972 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.186437 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.198973 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.243477 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.243505 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.243513 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc 
kubenswrapper[4802]: I1201 19:56:49.243528 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.243537 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.351286 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.351839 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.351868 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.351901 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.351924 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.454364 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.454415 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.454433 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.454458 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.454475 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.556834 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.556893 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.556904 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.556921 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.556934 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.659072 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.659108 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.659117 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.659131 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.659140 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.719503 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:49 crc kubenswrapper[4802]: E1201 19:56:49.719660 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.761915 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.762034 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.762048 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.762071 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.762083 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.864917 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.864965 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.864977 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.864993 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.865007 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.967598 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.967670 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.967686 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.967711 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:49 crc kubenswrapper[4802]: I1201 19:56:49.967729 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:49Z","lastTransitionTime":"2025-12-01T19:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.070337 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.070397 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.070414 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.070437 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.070454 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.172549 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.172591 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.172603 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.172620 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.172631 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.275547 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.275603 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.275622 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.275644 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.275661 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.324796 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.324858 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.324871 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.324893 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.324906 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: E1201 19:56:50.349858 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:50Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.356558 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.356605 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.356617 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.356636 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.356649 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: E1201 19:56:50.377145 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:50Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.382095 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.382153 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.382171 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.382235 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.382256 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: E1201 19:56:50.402744 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:50Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.408534 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.408610 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.408638 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.408675 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.408700 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: E1201 19:56:50.424470 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:50Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.430428 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.430501 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.430520 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.430548 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.430566 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: E1201 19:56:50.445415 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:50Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:50 crc kubenswrapper[4802]: E1201 19:56:50.445670 4802 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.447989 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.448070 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.448099 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.448135 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.448166 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.551042 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.551148 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.551398 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.551438 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.551460 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.654161 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.654235 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.654246 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.654264 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.654306 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.719320 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.719372 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:50 crc kubenswrapper[4802]: E1201 19:56:50.719467 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:50 crc kubenswrapper[4802]: E1201 19:56:50.719544 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.757417 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.757491 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.757510 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.757534 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.757549 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.860775 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.860848 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.860868 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.860897 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.860917 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.963808 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.963889 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.963921 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.963946 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:50 crc kubenswrapper[4802]: I1201 19:56:50.963967 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:50Z","lastTransitionTime":"2025-12-01T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.067271 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.067335 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.067351 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.067376 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.067396 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.170442 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.170531 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.170550 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.171047 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.171106 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.274592 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.274638 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.274649 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.274665 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.274678 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.377479 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.377560 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.377585 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.377617 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.377643 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.480033 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.480076 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.480087 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.480102 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.480114 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.582384 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.582469 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.582489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.582518 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.582536 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.685638 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.685677 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.685687 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.685703 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.685713 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.719579 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:51 crc kubenswrapper[4802]: E1201 19:56:51.719800 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.783291 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk"] Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.783930 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.786634 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.786752 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.787882 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.787926 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.787943 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.787965 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.787985 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.802552 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.803096 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.803179 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.803243 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2d4w\" (UniqueName: \"kubernetes.io/projected/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-kube-api-access-l2d4w\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.803337 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.820312 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.839898 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.854735 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.873297 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.891758 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc 
kubenswrapper[4802]: I1201 19:56:51.891807 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.891818 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.891856 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.891869 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.896038 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.903945 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.904021 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.904044 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2d4w\" (UniqueName: \"kubernetes.io/projected/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-kube-api-access-l2d4w\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.904092 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.904775 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.904878 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.910263 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.917988 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.921826 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2d4w\" (UniqueName: \"kubernetes.io/projected/a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f-kube-api-access-l2d4w\") pod \"ovnkube-control-plane-749d76644c-cknrk\" (UID: \"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.934436 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.945461 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.961383 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.973341 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.985350 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.994145 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.994347 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.994465 4802 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.994580 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.994673 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:51Z","lastTransitionTime":"2025-12-01T19:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:51 crc kubenswrapper[4802]: I1201 19:56:51.995067 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:51Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.006878 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.018509 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.097036 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.097083 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.097096 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.097114 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.097126 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:52Z","lastTransitionTime":"2025-12-01T19:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.099995 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" Dec 01 19:56:52 crc kubenswrapper[4802]: W1201 19:56:52.114128 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5dd3f54_4b2a_4ae6_9cce_d5ac0e044b0f.slice/crio-706c17bea22ed99e7079aacdcf2115610aaea9b4c4a836fc261788ab17f160fb WatchSource:0}: Error finding container 706c17bea22ed99e7079aacdcf2115610aaea9b4c4a836fc261788ab17f160fb: Status 404 returned error can't find the container with id 706c17bea22ed99e7079aacdcf2115610aaea9b4c4a836fc261788ab17f160fb Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.199751 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.199798 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.199815 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.199837 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.199853 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:52Z","lastTransitionTime":"2025-12-01T19:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.302273 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.302553 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.302566 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.302582 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.302600 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:52Z","lastTransitionTime":"2025-12-01T19:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.405056 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.405101 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.405109 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.405126 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.405135 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:52Z","lastTransitionTime":"2025-12-01T19:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.409465 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.409560 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.409604 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.409637 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:57:08.409617636 +0000 UTC m=+49.972177277 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.409680 4802 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.409728 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:57:08.409716579 +0000 UTC m=+49.972276220 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.409864 4802 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.410007 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 19:57:08.409978368 +0000 UTC m=+49.972538009 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.508188 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.508256 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.508265 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.508280 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.508290 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:52Z","lastTransitionTime":"2025-12-01T19:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.510635 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.510679 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.510786 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.510805 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.510808 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.510817 4802 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 
19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.510828 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.510839 4802 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.510865 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 19:57:08.510852943 +0000 UTC m=+50.073412574 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.510898 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 19:57:08.510877094 +0000 UTC m=+50.073436785 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.610530 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.610563 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.610571 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.610587 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.610596 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:52Z","lastTransitionTime":"2025-12-01T19:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.713077 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.713122 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.713134 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.713151 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.713160 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:52Z","lastTransitionTime":"2025-12-01T19:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.719450 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.719465 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.719559 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.719628 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.815355 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.815405 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.815417 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.815435 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.815450 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:52Z","lastTransitionTime":"2025-12-01T19:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.885587 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-p8cs7"] Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.886226 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:52 crc kubenswrapper[4802]: E1201 19:56:52.886314 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.901323 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.913815 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.913894 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n95sp\" (UniqueName: \"kubernetes.io/projected/008be62d-2cef-42a3-912f-2b2e58f8e30b-kube-api-access-n95sp\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.915483 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.916961 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.916990 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.917001 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.917018 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.917029 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:52Z","lastTransitionTime":"2025-12-01T19:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.928534 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.945590 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c
4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:
56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.958036 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc 
kubenswrapper[4802]: I1201 19:56:52.970013 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.983473 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:52 crc kubenswrapper[4802]: I1201 19:56:52.998065 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.008314 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" event={"ID":"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f","Type":"ContainerStarted","Data":"eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.008396 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" event={"ID":"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f","Type":"ContainerStarted","Data":"8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.008420 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" event={"ID":"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f","Type":"ContainerStarted","Data":"706c17bea22ed99e7079aacdcf2115610aaea9b4c4a836fc261788ab17f160fb"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.013188 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.014618 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n95sp\" (UniqueName: \"kubernetes.io/projected/008be62d-2cef-42a3-912f-2b2e58f8e30b-kube-api-access-n95sp\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.014708 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:53 crc kubenswrapper[4802]: E1201 19:56:53.014838 4802 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:56:53 crc kubenswrapper[4802]: E1201 19:56:53.014911 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs podName:008be62d-2cef-42a3-912f-2b2e58f8e30b nodeName:}" failed. No retries permitted until 2025-12-01 19:56:53.514889304 +0000 UTC m=+35.077448975 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs") pod "network-metrics-daemon-p8cs7" (UID: "008be62d-2cef-42a3-912f-2b2e58f8e30b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.018953 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.019027 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.019053 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.019085 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.019108 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.027007 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.035790 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n95sp\" (UniqueName: \"kubernetes.io/projected/008be62d-2cef-42a3-912f-2b2e58f8e30b-kube-api-access-n95sp\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.043117 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.060066 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.077889 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.093695 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.110527 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.121646 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc 
kubenswrapper[4802]: I1201 19:56:53.121685 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.121703 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.121770 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.121787 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.140704 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.154245 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.170425 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.188607 4802 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.206817 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.223608 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc 
kubenswrapper[4802]: I1201 19:56:53.224107 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.224264 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.224424 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.224552 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.235130 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.251966 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.268177 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.280055 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.295931 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.312441 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.326631 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.326705 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.326730 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc 
kubenswrapper[4802]: I1201 19:56:53.326760 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.326783 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.326889 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.340838 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.354703 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.366241 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc 
kubenswrapper[4802]: I1201 19:56:53.386489 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.398954 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.429272 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.429329 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.429346 4802 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.429372 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.429391 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.519133 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:53 crc kubenswrapper[4802]: E1201 19:56:53.519305 4802 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:56:53 crc kubenswrapper[4802]: E1201 19:56:53.519364 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs podName:008be62d-2cef-42a3-912f-2b2e58f8e30b nodeName:}" failed. No retries permitted until 2025-12-01 19:56:54.519347999 +0000 UTC m=+36.081907640 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs") pod "network-metrics-daemon-p8cs7" (UID: "008be62d-2cef-42a3-912f-2b2e58f8e30b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.531685 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.531712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.531720 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.531734 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.531743 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.634698 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.634737 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.634748 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.634763 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.634775 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.719902 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:53 crc kubenswrapper[4802]: E1201 19:56:53.720856 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.737836 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.737928 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.737938 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.737952 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.737960 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.841465 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.841523 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.841542 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.841569 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.841589 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.945441 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.945504 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.945520 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.945543 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:53 crc kubenswrapper[4802]: I1201 19:56:53.946023 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:53Z","lastTransitionTime":"2025-12-01T19:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.050372 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.050426 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.050442 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.050463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.050478 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.153770 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.153835 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.153853 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.153893 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.153913 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.256861 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.256919 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.256940 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.256965 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.256984 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.359565 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.359623 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.359642 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.359666 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.359683 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.462490 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.462535 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.462554 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.462577 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.462594 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.528529 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:54 crc kubenswrapper[4802]: E1201 19:56:54.528699 4802 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:56:54 crc kubenswrapper[4802]: E1201 19:56:54.528795 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs podName:008be62d-2cef-42a3-912f-2b2e58f8e30b nodeName:}" failed. No retries permitted until 2025-12-01 19:56:56.528766556 +0000 UTC m=+38.091326237 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs") pod "network-metrics-daemon-p8cs7" (UID: "008be62d-2cef-42a3-912f-2b2e58f8e30b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.565480 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.565519 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.565535 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.565556 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.565572 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.668843 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.668890 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.668907 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.668932 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.668950 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.719346 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.719474 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.719374 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:54 crc kubenswrapper[4802]: E1201 19:56:54.719613 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:54 crc kubenswrapper[4802]: E1201 19:56:54.719744 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:54 crc kubenswrapper[4802]: E1201 19:56:54.719889 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.772360 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.772414 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.772425 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.772448 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.772460 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.875482 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.875541 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.875604 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.875632 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.875649 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.979314 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.979370 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.979382 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.979401 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:54 crc kubenswrapper[4802]: I1201 19:56:54.979415 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:54Z","lastTransitionTime":"2025-12-01T19:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.082542 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.082624 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.082651 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.082682 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.082700 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:55Z","lastTransitionTime":"2025-12-01T19:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.186288 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.186348 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.186371 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.186402 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.186426 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:55Z","lastTransitionTime":"2025-12-01T19:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.293822 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.293886 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.293900 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.293918 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.293931 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:55Z","lastTransitionTime":"2025-12-01T19:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.396287 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.396346 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.396359 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.396380 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.396405 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:55Z","lastTransitionTime":"2025-12-01T19:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.498411 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.498445 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.498452 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.498467 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.498477 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:55Z","lastTransitionTime":"2025-12-01T19:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.602017 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.602102 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.602114 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.602130 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.602143 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:55Z","lastTransitionTime":"2025-12-01T19:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.705190 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.705289 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.705313 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.705342 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.705365 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:55Z","lastTransitionTime":"2025-12-01T19:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.719877 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:55 crc kubenswrapper[4802]: E1201 19:56:55.720085 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.808022 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.808069 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.808079 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.808092 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.808102 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:55Z","lastTransitionTime":"2025-12-01T19:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.911745 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.911821 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.911847 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.911878 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:55 crc kubenswrapper[4802]: I1201 19:56:55.911901 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:55Z","lastTransitionTime":"2025-12-01T19:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.014730 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.014789 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.014810 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.014835 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.014853 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.117949 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.118478 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.118624 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.118778 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.118912 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.221846 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.221884 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.221894 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.221908 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.221919 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.324797 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.324837 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.324846 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.324861 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.324871 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.426787 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.426823 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.426831 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.426845 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.426855 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.528661 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.528707 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.528718 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.528735 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.528744 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.552674 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:56 crc kubenswrapper[4802]: E1201 19:56:56.552851 4802 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:56:56 crc kubenswrapper[4802]: E1201 19:56:56.552941 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs podName:008be62d-2cef-42a3-912f-2b2e58f8e30b nodeName:}" failed. No retries permitted until 2025-12-01 19:57:00.552916541 +0000 UTC m=+42.115476222 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs") pod "network-metrics-daemon-p8cs7" (UID: "008be62d-2cef-42a3-912f-2b2e58f8e30b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.632230 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.632288 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.632305 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.632332 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.632360 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.719436 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.719558 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.719494 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:56 crc kubenswrapper[4802]: E1201 19:56:56.719667 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:56 crc kubenswrapper[4802]: E1201 19:56:56.719816 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:56:56 crc kubenswrapper[4802]: E1201 19:56:56.719980 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.735929 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.736001 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.736025 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.736058 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.736080 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.806585 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.828122 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.840048 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.840111 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.840128 4802 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.840154 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.840172 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.852638 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.876600 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] 
Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.893996 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.910577 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.924238 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.938678 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.942768 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.942813 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.942824 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.942843 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.942857 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:56Z","lastTransitionTime":"2025-12-01T19:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.959893 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.973829 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:56 crc kubenswrapper[4802]: I1201 19:56:56.986732 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.001578 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eec
be871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:56Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.015548 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:57Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:57 crc 
kubenswrapper[4802]: I1201 19:56:57.035767 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:57Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.045711 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.045771 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.045790 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.045811 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.045821 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.049762 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:57Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.064833 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:57Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.076479 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:57Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.148170 4802 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.148267 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.148281 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.148302 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.148315 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.251500 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.251561 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.251581 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.251608 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.251626 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.354508 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.354555 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.354570 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.354601 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.354613 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.456544 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.456604 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.456614 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.456628 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.456638 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.559946 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.559991 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.560000 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.560018 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.560028 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.662488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.662529 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.662540 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.662555 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.662566 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.719241 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:57 crc kubenswrapper[4802]: E1201 19:56:57.719404 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.765592 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.765635 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.765643 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.765657 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.765666 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.868246 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.868288 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.868298 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.868315 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.868327 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.969866 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.969909 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.969921 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.969939 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:57 crc kubenswrapper[4802]: I1201 19:56:57.969951 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:57Z","lastTransitionTime":"2025-12-01T19:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.072417 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.072463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.072474 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.072497 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.072510 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.175789 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.175828 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.175852 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.175870 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.175899 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.278294 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.278369 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.278402 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.278423 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.278434 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.381718 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.381766 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.381776 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.381791 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.381800 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.484227 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.484266 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.484279 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.484296 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.484308 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.587030 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.587068 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.587078 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.587115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.587124 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.689764 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.689996 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.690056 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.690116 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.690180 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.719451 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.719461 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.719653 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:56:58 crc kubenswrapper[4802]: E1201 19:56:58.719949 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:56:58 crc kubenswrapper[4802]: E1201 19:56:58.719907 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:56:58 crc kubenswrapper[4802]: E1201 19:56:58.720042 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.734403 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.746908 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.760415 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.771250 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.781360 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e6
3278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.791338 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc 
kubenswrapper[4802]: I1201 19:56:58.793310 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.793355 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.793368 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.793398 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.793409 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.809295 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.821843 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.834143 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\
",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.846733 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.858950 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.880602 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] 
Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.896546 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.896583 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.896593 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.896609 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.896618 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.902398 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.915996 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.937514 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.951925 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 
tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:56:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.998782 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.998823 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.998837 4802 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.998853 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:58 crc kubenswrapper[4802]: I1201 19:56:58.998864 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:58Z","lastTransitionTime":"2025-12-01T19:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.101348 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.101410 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.101426 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.101464 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.101480 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:59Z","lastTransitionTime":"2025-12-01T19:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.204546 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.204569 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.204578 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.204613 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.204624 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:59Z","lastTransitionTime":"2025-12-01T19:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.307670 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.307733 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.307753 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.307779 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.307798 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:59Z","lastTransitionTime":"2025-12-01T19:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.410668 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.410736 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.410756 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.410780 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.410799 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:59Z","lastTransitionTime":"2025-12-01T19:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.513641 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.513685 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.513695 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.513712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.513722 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:59Z","lastTransitionTime":"2025-12-01T19:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.616358 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.616406 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.616416 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.616432 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.616442 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:59Z","lastTransitionTime":"2025-12-01T19:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.718982 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:56:59 crc kubenswrapper[4802]: E1201 19:56:59.719156 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.719922 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.720016 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.720057 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.720075 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.720087 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:59Z","lastTransitionTime":"2025-12-01T19:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.824505 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.824565 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.824577 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.824616 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.824633 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:59Z","lastTransitionTime":"2025-12-01T19:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.927882 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.927932 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.927943 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.927960 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:56:59 crc kubenswrapper[4802]: I1201 19:56:59.927978 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:56:59Z","lastTransitionTime":"2025-12-01T19:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.030149 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.030221 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.030238 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.030253 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.030263 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.133260 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.133332 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.133349 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.133375 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.133401 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.236112 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.236144 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.236152 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.236167 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.236178 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.338906 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.338948 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.338960 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.338974 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.338986 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.441943 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.441977 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.441985 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.441998 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.442009 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.544577 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.544660 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.544679 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.544706 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.544737 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.597046 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.597307 4802 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.597432 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs podName:008be62d-2cef-42a3-912f-2b2e58f8e30b nodeName:}" failed. No retries permitted until 2025-12-01 19:57:08.597403038 +0000 UTC m=+50.159962709 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs") pod "network-metrics-daemon-p8cs7" (UID: "008be62d-2cef-42a3-912f-2b2e58f8e30b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.647048 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.647108 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.647126 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.647152 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.647170 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.719507 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.719581 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.719738 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.719795 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.719882 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.720030 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.750033 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.750101 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.750131 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.750166 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.750178 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.757149 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.757179 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.757206 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.757221 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.757232 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.784007 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:00Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.788868 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.789076 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.789392 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.789645 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.789870 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.807691 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:00Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.811788 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.811876 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.811900 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.811931 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.811956 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.831798 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:00Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.836490 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.836540 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.836559 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.836583 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.836601 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.855120 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:00Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.860410 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.860453 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.860464 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.860482 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.860495 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.878583 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:00Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:00 crc kubenswrapper[4802]: E1201 19:57:00.878967 4802 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.880737 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.880799 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.880815 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.880834 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.880850 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.984375 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.984426 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.984437 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.984451 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:00 crc kubenswrapper[4802]: I1201 19:57:00.984460 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:00Z","lastTransitionTime":"2025-12-01T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.087691 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.087721 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.087731 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.087754 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.087764 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:01Z","lastTransitionTime":"2025-12-01T19:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.190740 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.190773 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.190783 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.190800 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.190812 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:01Z","lastTransitionTime":"2025-12-01T19:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.292727 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.292768 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.292780 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.292796 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.292808 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:01Z","lastTransitionTime":"2025-12-01T19:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.395387 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.395607 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.395668 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.395777 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.395847 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:01Z","lastTransitionTime":"2025-12-01T19:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.498787 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.498824 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.498834 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.498849 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.498860 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:01Z","lastTransitionTime":"2025-12-01T19:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.600680 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.600727 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.600736 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.600749 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.600757 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:01Z","lastTransitionTime":"2025-12-01T19:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.703257 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.703289 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.703297 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.703310 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.703320 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:01Z","lastTransitionTime":"2025-12-01T19:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.719262 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:01 crc kubenswrapper[4802]: E1201 19:57:01.719435 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.806083 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.806121 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.806133 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.806153 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.806163 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:01Z","lastTransitionTime":"2025-12-01T19:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.909591 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.909648 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.909657 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.909678 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:01 crc kubenswrapper[4802]: I1201 19:57:01.909688 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:01Z","lastTransitionTime":"2025-12-01T19:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.012687 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.012741 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.012758 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.012779 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.012793 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.115692 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.115754 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.115767 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.115785 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.115794 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.218170 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.218252 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.218266 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.218281 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.218291 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.320227 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.320294 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.320305 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.320320 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.320332 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.422142 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.422237 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.422261 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.422290 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.422312 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.529386 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.529427 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.529440 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.529456 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.529468 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.631581 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.631629 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.631640 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.631655 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.631666 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.719777 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:02 crc kubenswrapper[4802]: E1201 19:57:02.719947 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.720092 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:02 crc kubenswrapper[4802]: E1201 19:57:02.720291 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.720309 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:02 crc kubenswrapper[4802]: E1201 19:57:02.720439 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.733564 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.733625 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.733634 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.733648 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.733658 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.835647 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.835892 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.835900 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.835913 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.835922 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.938740 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.938787 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.938799 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.938818 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:02 crc kubenswrapper[4802]: I1201 19:57:02.938830 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:02Z","lastTransitionTime":"2025-12-01T19:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.040696 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.040733 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.040751 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.040771 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.040783 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.143583 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.143632 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.143642 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.143659 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.143671 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.246793 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.246852 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.246869 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.246898 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.246922 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.349041 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.349080 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.349092 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.349108 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.349117 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.451688 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.451728 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.451739 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.451755 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.451765 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.554694 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.554737 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.554746 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.554762 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.554773 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.656615 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.656640 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.656650 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.656662 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.656670 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.719600 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:03 crc kubenswrapper[4802]: E1201 19:57:03.719782 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.758725 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.758755 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.758763 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.758779 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.758787 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.861124 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.861190 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.861258 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.861292 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.861315 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.964882 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.964948 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.964968 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.964998 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:03 crc kubenswrapper[4802]: I1201 19:57:03.965021 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:03Z","lastTransitionTime":"2025-12-01T19:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.067719 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.067760 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.067772 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.067788 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.067800 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.170584 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.170641 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.170657 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.170678 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.170694 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.272617 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.272678 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.272696 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.272713 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.272725 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.375049 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.375095 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.375104 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.375118 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.375129 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.477662 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.477692 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.477700 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.477712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.477722 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.580332 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.580393 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.580412 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.580438 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.580455 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.683418 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.683464 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.683477 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.683493 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.683503 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.719419 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:04 crc kubenswrapper[4802]: E1201 19:57:04.719582 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.719647 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.719739 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:04 crc kubenswrapper[4802]: E1201 19:57:04.719869 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:04 crc kubenswrapper[4802]: E1201 19:57:04.720228 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.785939 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.785985 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.785999 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.786016 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.786030 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.888564 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.888623 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.888641 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.888663 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.888679 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.991639 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.991686 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.991696 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.991712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:04 crc kubenswrapper[4802]: I1201 19:57:04.991722 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:04Z","lastTransitionTime":"2025-12-01T19:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.093774 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.093808 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.093817 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.093831 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.093840 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:05Z","lastTransitionTime":"2025-12-01T19:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.196595 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.196628 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.196635 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.196649 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.196658 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:05Z","lastTransitionTime":"2025-12-01T19:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.299037 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.299104 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.299113 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.299127 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.299137 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:05Z","lastTransitionTime":"2025-12-01T19:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.401848 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.401887 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.401897 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.401912 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.401921 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:05Z","lastTransitionTime":"2025-12-01T19:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.504452 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.504491 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.504502 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.504515 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.504524 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:05Z","lastTransitionTime":"2025-12-01T19:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.607090 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.607136 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.607153 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.607179 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.607235 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:05Z","lastTransitionTime":"2025-12-01T19:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.710039 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.710137 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.710155 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.710180 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.710222 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:05Z","lastTransitionTime":"2025-12-01T19:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.719451 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:05 crc kubenswrapper[4802]: E1201 19:57:05.719643 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.813554 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.813616 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.813633 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.813662 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.813680 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:05Z","lastTransitionTime":"2025-12-01T19:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.916421 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.916498 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.916518 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.916545 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:05 crc kubenswrapper[4802]: I1201 19:57:05.916564 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:05Z","lastTransitionTime":"2025-12-01T19:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.018813 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.018860 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.018875 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.018893 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.018906 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.120973 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.121024 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.121036 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.121054 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.121068 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.223957 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.224021 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.224038 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.224063 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.224081 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.326997 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.327044 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.327054 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.327071 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.327084 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.429325 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.429368 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.429378 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.429392 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.429403 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.531610 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.531647 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.531655 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.531670 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.531683 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.633941 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.633992 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.634009 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.634034 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.634053 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.719827 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.719970 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.719838 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:06 crc kubenswrapper[4802]: E1201 19:57:06.720137 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:06 crc kubenswrapper[4802]: E1201 19:57:06.720271 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:06 crc kubenswrapper[4802]: E1201 19:57:06.720380 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.737003 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.737043 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.737052 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.737067 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.737076 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.839322 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.839364 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.839379 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.839398 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.839411 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.942138 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.942187 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.942221 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.942239 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:06 crc kubenswrapper[4802]: I1201 19:57:06.942250 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:06Z","lastTransitionTime":"2025-12-01T19:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.045657 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.045719 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.045740 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.045764 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.045782 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.148252 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.148320 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.148336 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.148356 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.148370 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.251706 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.251757 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.251767 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.251782 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.251790 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.356724 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.356773 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.357198 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.357295 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.357882 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.459602 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.459659 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.459678 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.459702 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.459720 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.547510 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.556614 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.562459 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.562489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.562496 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.562509 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.562519 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.566036 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.578017 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.593630 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.606930 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.622456 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.641875 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] 
Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.652734 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.665716 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.665806 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.665827 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc 
kubenswrapper[4802]: I1201 19:57:07.665850 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.665890 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.666775 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c
bc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.680081 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.694620 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.707098 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.719118 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.719150 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:07 crc kubenswrapper[4802]: E1201 19:57:07.719261 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.728240 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0
d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.737397 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.747055 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc 
kubenswrapper[4802]: I1201 19:57:07.758650 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:07Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.768341 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.768370 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.768378 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.768393 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.768403 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.872373 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.872447 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.872467 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.872497 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.872517 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.975177 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.975290 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.975308 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.975339 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:07 crc kubenswrapper[4802]: I1201 19:57:07.975359 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:07Z","lastTransitionTime":"2025-12-01T19:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.077888 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.077997 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.078024 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.078061 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.078089 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:08Z","lastTransitionTime":"2025-12-01T19:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.181121 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.181242 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.181262 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.181283 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.181295 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:08Z","lastTransitionTime":"2025-12-01T19:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.285354 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.285438 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.285458 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.285488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.285509 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:08Z","lastTransitionTime":"2025-12-01T19:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.388944 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.389056 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.389084 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.389129 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.389155 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:08Z","lastTransitionTime":"2025-12-01T19:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.484003 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.484178 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.484307 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.484412 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:57:40.484365517 +0000 UTC m=+82.046925178 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.484512 4802 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.484532 4802 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.484609 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:57:40.484582344 +0000 UTC m=+82.047141995 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.484705 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 19:57:40.484643126 +0000 UTC m=+82.047202807 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.492382 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.492435 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.492450 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.492474 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.492490 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:08Z","lastTransitionTime":"2025-12-01T19:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.585148 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.585337 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.585524 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.585524 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.585594 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.585617 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.585630 4802 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.585630 4802 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.585740 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 19:57:40.585701867 +0000 UTC m=+82.148261698 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.585767 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 19:57:40.585757689 +0000 UTC m=+82.148317580 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.595285 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.595353 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.595368 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.595415 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.595438 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:08Z","lastTransitionTime":"2025-12-01T19:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.686192 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.686442 4802 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.686536 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs podName:008be62d-2cef-42a3-912f-2b2e58f8e30b nodeName:}" failed. No retries permitted until 2025-12-01 19:57:24.686512409 +0000 UTC m=+66.249072080 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs") pod "network-metrics-daemon-p8cs7" (UID: "008be62d-2cef-42a3-912f-2b2e58f8e30b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.698797 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.698860 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.698881 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.698914 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.698936 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:08Z","lastTransitionTime":"2025-12-01T19:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.719258 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.719293 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.719293 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.719424 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.719662 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:08 crc kubenswrapper[4802]: E1201 19:57:08.719829 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.742690 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.761023 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.781050 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.801114 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] 
Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.802929 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.802972 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.802987 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.803008 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.803023 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:08Z","lastTransitionTime":"2025-12-01T19:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.822639 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6
893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.842922 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.862515 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.885183 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.902558 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe
6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.906560 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.906631 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.906655 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.906727 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:08 crc 
kubenswrapper[4802]: I1201 19:57:08.906750 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:08Z","lastTransitionTime":"2025-12-01T19:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.920705 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.945797 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.961763 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.975123 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:08 crc kubenswrapper[4802]: I1201 19:57:08.986253 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e6
3278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:08Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.003269 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc 
kubenswrapper[4802]: I1201 19:57:09.010639 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.010720 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.010740 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.010772 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.010798 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.051026 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.060700 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/1.log" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.061900 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/0.log" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.072480 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507" exitCode=1 Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.072566 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.072660 4802 scope.go:117] "RemoveContainer" containerID="7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.073901 4802 scope.go:117] "RemoveContainer" containerID="c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507" Dec 01 19:57:09 crc kubenswrapper[4802]: E1201 19:57:09.074156 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 
10s restarting failed container=ovnkube-controller pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.090577 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:5
6:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 
2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.107412 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.114183 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.114257 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.114268 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.114302 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.114316 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.121585 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc 
kubenswrapper[4802]: I1201 19:57:09.138228 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.154530 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.170525 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.193416 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.208682 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.217683 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.217737 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.217748 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.217772 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.217786 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.228247 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.248340 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.276036 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] 
Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:08Z\\\",\\\"message\\\":\\\"gmap:NhpE8g== operator.openshift.io/dep-openshift-apiserver.trusted-ca-bundle.configmap:ElMHxA==]\\\\nI1201 19:57:08.649562 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr: failed to check if pod openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:nonroot-v2 openshift.io/scc:nonroot-v2 
seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1201 19:57:08.649698 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s: failed to check if pod openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nE1201 19:57:08.712538 6233 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1201 19:57:08.713800 6233 ovnkube.go:599] Stopped ovnkube\\\\nI1201 19:57:08.713874 6233 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.298735 4802 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825
771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-co
ntroller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.320441 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.321343 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.321429 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.321486 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.321517 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.321575 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.337531 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:
56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.356348 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3
fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.374582 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.393600 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.417820 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:09Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.425555 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.425614 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.425633 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.425667 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.425688 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.529463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.529509 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.529524 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.529544 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.529557 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.635093 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.635173 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.635198 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.635289 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.635311 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.719780 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:09 crc kubenswrapper[4802]: E1201 19:57:09.720029 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.738932 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.738989 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.739002 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.739025 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.739042 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.842858 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.842902 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.842911 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.842929 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.842940 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.945852 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.945892 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.945903 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.945922 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:09 crc kubenswrapper[4802]: I1201 19:57:09.945934 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:09Z","lastTransitionTime":"2025-12-01T19:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.050176 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.050302 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.050360 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.050390 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.050446 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.080350 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/1.log" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.154458 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.154530 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.154551 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.154583 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.154605 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.258108 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.258154 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.258163 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.258184 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.258196 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.361832 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.361895 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.361911 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.361939 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.361960 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.466272 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.466330 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.466350 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.466379 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.466398 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.570312 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.570381 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.570402 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.570434 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.570455 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.673454 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.673526 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.673545 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.673574 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.673591 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.719910 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.719999 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:10 crc kubenswrapper[4802]: E1201 19:57:10.720108 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.720156 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:10 crc kubenswrapper[4802]: E1201 19:57:10.720330 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:10 crc kubenswrapper[4802]: E1201 19:57:10.720415 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.777247 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.777293 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.777308 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.777329 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.777342 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.879454 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.879506 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.879521 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.879543 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.879559 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.981790 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.981845 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.981857 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.981877 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:10 crc kubenswrapper[4802]: I1201 19:57:10.981891 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:10Z","lastTransitionTime":"2025-12-01T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: E1201 19:57:11.005889 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:11Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.011923 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.011978 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.011992 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.012012 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.012026 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: E1201 19:57:11.032777 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:11Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.038631 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.038689 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.038709 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.038740 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.038761 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.064017 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.064070 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.064091 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.064115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.064134 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: E1201 19:57:11.080820 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:11Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.085568 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.085601 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.085612 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.085649 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.085661 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: E1201 19:57:11.102090 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:11Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:11 crc kubenswrapper[4802]: E1201 19:57:11.102256 4802 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.103980 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.104039 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.104060 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.104097 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.104124 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.206817 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.206881 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.206894 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.206911 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.206923 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.309911 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.309991 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.310014 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.310049 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.310083 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.412759 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.412848 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.412873 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.412909 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.412952 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.516798 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.516862 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.516882 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.516908 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.516925 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.620485 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.620547 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.620561 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.620586 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.620601 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.719373 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:11 crc kubenswrapper[4802]: E1201 19:57:11.719495 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.723333 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.723392 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.723405 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.723431 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.723444 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.826044 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.826120 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.826133 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.826150 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.826161 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.928929 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.928971 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.928981 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.928995 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:11 crc kubenswrapper[4802]: I1201 19:57:11.929006 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:11Z","lastTransitionTime":"2025-12-01T19:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.031789 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.031823 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.031832 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.031847 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.031856 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.134271 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.134339 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.134356 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.134371 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.134382 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.237576 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.237628 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.237640 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.237660 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.237671 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.341121 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.341723 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.341740 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.341761 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.341774 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.443730 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.443763 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.443775 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.443790 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.443799 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.546358 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.546393 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.546402 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.546418 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.546453 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.649806 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.649849 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.649859 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.649880 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.649891 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.719795 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.719824 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:12 crc kubenswrapper[4802]: E1201 19:57:12.719933 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.720054 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:12 crc kubenswrapper[4802]: E1201 19:57:12.720234 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:12 crc kubenswrapper[4802]: E1201 19:57:12.720631 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.752874 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.752918 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.752927 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.752943 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.752956 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.856171 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.856261 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.856279 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.856347 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.856364 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.959793 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.959899 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.959922 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.959955 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:12 crc kubenswrapper[4802]: I1201 19:57:12.959973 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:12Z","lastTransitionTime":"2025-12-01T19:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.063294 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.063369 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.063387 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.063415 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.063431 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.165856 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.165889 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.165897 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.165912 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.165921 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.268959 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.269021 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.269038 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.269063 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.269082 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.371698 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.371736 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.371746 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.371762 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.371773 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.474434 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.474508 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.474521 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.474541 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.474554 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.577440 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.577489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.577501 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.577521 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.577534 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.679532 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.679597 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.679614 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.679639 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.679660 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.719425 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:13 crc kubenswrapper[4802]: E1201 19:57:13.719606 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.781681 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.781725 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.781737 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.781759 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.781771 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.883643 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.883682 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.883693 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.883708 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.883718 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.986313 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.986373 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.986381 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.986397 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:13 crc kubenswrapper[4802]: I1201 19:57:13.986409 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:13Z","lastTransitionTime":"2025-12-01T19:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.090038 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.090112 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.090140 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.090168 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.090190 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:14Z","lastTransitionTime":"2025-12-01T19:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.192820 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.192854 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.192862 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.192877 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.192888 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:14Z","lastTransitionTime":"2025-12-01T19:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.295024 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.295073 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.295094 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.295115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.295131 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:14Z","lastTransitionTime":"2025-12-01T19:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.397463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.397526 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.397548 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.397575 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.397597 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:14Z","lastTransitionTime":"2025-12-01T19:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.500569 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.500650 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.500677 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.500706 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.500724 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:14Z","lastTransitionTime":"2025-12-01T19:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.603573 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.603614 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.603624 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.603640 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.603651 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:14Z","lastTransitionTime":"2025-12-01T19:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.706247 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.706289 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.706298 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.706313 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.706323 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:14Z","lastTransitionTime":"2025-12-01T19:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.719637 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.719708 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.719733 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:14 crc kubenswrapper[4802]: E1201 19:57:14.719807 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:14 crc kubenswrapper[4802]: E1201 19:57:14.720039 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:14 crc kubenswrapper[4802]: E1201 19:57:14.720166 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.808624 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.808734 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.808745 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.808759 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.808767 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:14Z","lastTransitionTime":"2025-12-01T19:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.910610 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.910682 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.910696 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.910712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:14 crc kubenswrapper[4802]: I1201 19:57:14.910722 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:14Z","lastTransitionTime":"2025-12-01T19:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.013151 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.013252 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.013278 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.013307 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.013329 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.115651 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.115712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.115730 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.115754 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.115771 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.218736 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.218792 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.218808 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.218832 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.218849 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.322378 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.322444 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.322468 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.322498 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.322519 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.426119 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.426160 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.426171 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.426188 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.426218 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.529367 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.529425 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.529441 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.529463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.529480 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.632476 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.632568 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.632584 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.632609 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.632625 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.719427 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:15 crc kubenswrapper[4802]: E1201 19:57:15.719618 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.736630 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.736696 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.736718 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.736766 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.736790 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.840382 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.840442 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.840460 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.840485 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.840507 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.943446 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.943501 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.943520 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.943544 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:15 crc kubenswrapper[4802]: I1201 19:57:15.943560 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:15Z","lastTransitionTime":"2025-12-01T19:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.046539 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.046678 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.046708 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.046737 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.046757 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.149233 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.149302 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.149325 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.149354 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.149375 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.252279 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.252318 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.252329 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.252345 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.252357 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.354569 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.354602 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.354610 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.354623 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.354632 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.458969 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.459022 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.459036 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.459058 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.459072 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.562400 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.562435 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.562448 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.562465 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.562476 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.666021 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.666077 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.666086 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.666105 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.666115 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.719233 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:16 crc kubenswrapper[4802]: E1201 19:57:16.719399 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.719488 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.719569 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:16 crc kubenswrapper[4802]: E1201 19:57:16.719671 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:16 crc kubenswrapper[4802]: E1201 19:57:16.719820 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.769224 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.769265 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.769280 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.769297 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.769310 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.872361 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.872479 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.872510 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.872547 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.872573 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.975536 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.975601 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.975619 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.975651 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:16 crc kubenswrapper[4802]: I1201 19:57:16.975673 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:16Z","lastTransitionTime":"2025-12-01T19:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.078477 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.078551 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.078573 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.078603 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.078626 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:17Z","lastTransitionTime":"2025-12-01T19:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.182131 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.182228 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.182257 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.182285 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.182304 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:17Z","lastTransitionTime":"2025-12-01T19:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.285607 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.285677 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.285695 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.285732 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.285774 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:17Z","lastTransitionTime":"2025-12-01T19:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.388590 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.388630 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.388638 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.388652 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.388661 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:17Z","lastTransitionTime":"2025-12-01T19:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.492045 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.492095 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.492107 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.492126 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.492140 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:17Z","lastTransitionTime":"2025-12-01T19:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.594372 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.594409 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.594420 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.594439 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.594451 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:17Z","lastTransitionTime":"2025-12-01T19:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.697439 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.697485 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.697496 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.697514 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.697526 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:17Z","lastTransitionTime":"2025-12-01T19:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.719361 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:17 crc kubenswrapper[4802]: E1201 19:57:17.719523 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.799951 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.800015 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.800037 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.800108 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.800132 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:17Z","lastTransitionTime":"2025-12-01T19:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.902983 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.903014 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.903022 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.903037 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:17 crc kubenswrapper[4802]: I1201 19:57:17.903046 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:17Z","lastTransitionTime":"2025-12-01T19:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.006461 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.006584 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.006684 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.006719 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.006743 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.108992 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.109080 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.109104 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.109133 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.109156 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.211494 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.211537 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.211554 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.211573 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.211583 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.313430 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.313473 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.313488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.313504 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.313514 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.415879 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.415949 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.415972 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.416003 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.416027 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.519180 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.519271 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.519288 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.519351 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.519372 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.622191 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.622252 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.622260 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.622290 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.622298 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.719453 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.719591 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.720460 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:18 crc kubenswrapper[4802]: E1201 19:57:18.720425 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:18 crc kubenswrapper[4802]: E1201 19:57:18.720886 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:18 crc kubenswrapper[4802]: E1201 19:57:18.721439 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.728949 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.729492 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.729544 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.729575 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.729626 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.742117 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f
79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.759170 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.778027 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.796117 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ab25d7b3520b6624a5869f8689067345f266cd508c600ee7007d73bf0b5f6bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:56:47Z\\\",\\\"message\\\":\\\"or removal\\\\nI1201 19:56:47.225989 6104 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 19:56:47.226021 6104 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 19:56:47.226029 6104 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 19:56:47.226089 6104 handler.go:190] 
Sending *v1.Pod event handler 3 for removal\\\\nI1201 19:56:47.226103 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 19:56:47.226145 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 19:56:47.226172 6104 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 19:56:47.226186 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 19:56:47.226183 6104 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 19:56:47.226235 6104 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 19:56:47.226189 6104 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 19:56:47.226270 6104 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 19:56:47.226301 6104 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 19:56:47.226380 6104 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 19:56:47.226459 6104 factory.go:656] Stopping watch factory\\\\nI1201 19:56:47.226499 6104 ovnkube.go:599] Stopped ovnkube\\\\nI1201 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:08Z\\\",\\\"message\\\":\\\"gmap:NhpE8g== operator.openshift.io/dep-openshift-apiserver.trusted-ca-bundle.configmap:ElMHxA==]\\\\nI1201 19:57:08.649562 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr: failed to check if pod openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:nonroot-v2 openshift.io/scc:nonroot-v2 
seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1201 19:57:08.649698 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s: failed to check if pod openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nE1201 19:57:08.712538 6233 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1201 19:57:08.713800 6233 ovnkube.go:599] Stopped ovnkube\\\\nI1201 19:57:08.713874 6233 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.812456 4802 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.831871 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.831958 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.831976 4802 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.832000 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.832016 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.835106 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605
a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.850546 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.866237 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.879382 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.895660 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.907636 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.916960 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e6
3278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.933331 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc 
kubenswrapper[4802]: I1201 19:57:18.933828 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.933863 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.933875 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.933892 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.933904 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:18Z","lastTransitionTime":"2025-12-01T19:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.951119 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e
8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.968274 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.981617 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:18 crc kubenswrapper[4802]: I1201 19:57:18.993470 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:18Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.036932 4802 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.036983 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.036996 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.037012 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.037022 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.139761 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.139836 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.139856 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.139885 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.139905 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.242551 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.242607 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.242619 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.242637 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.242647 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.345500 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.345586 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.345606 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.345640 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.345659 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.448960 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.448991 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.449000 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.449015 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.449024 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.551878 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.551969 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.551990 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.552024 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.552043 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.655117 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.655179 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.655233 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.655264 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.655287 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.719494 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:19 crc kubenswrapper[4802]: E1201 19:57:19.719833 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.757784 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.757827 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.757837 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.757855 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.757867 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.861241 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.861311 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.861330 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.861358 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.861377 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.964176 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.964291 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.964304 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.964378 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:19 crc kubenswrapper[4802]: I1201 19:57:19.964395 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:19Z","lastTransitionTime":"2025-12-01T19:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.067072 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.067117 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.067127 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.067142 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.067153 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:20Z","lastTransitionTime":"2025-12-01T19:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.170142 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.170249 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.170274 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.170303 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.170328 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:20Z","lastTransitionTime":"2025-12-01T19:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.273288 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.273344 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.273353 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.273368 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.273376 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:20Z","lastTransitionTime":"2025-12-01T19:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.375585 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.375620 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.375628 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.375644 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.375653 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:20Z","lastTransitionTime":"2025-12-01T19:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.477897 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.477925 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.477934 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.477946 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.477955 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:20Z","lastTransitionTime":"2025-12-01T19:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.580069 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.580095 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.580103 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.580116 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.580125 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:20Z","lastTransitionTime":"2025-12-01T19:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.682607 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.682637 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.682648 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.682663 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.682673 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:20Z","lastTransitionTime":"2025-12-01T19:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.721722 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:20 crc kubenswrapper[4802]: E1201 19:57:20.721838 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.722024 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:20 crc kubenswrapper[4802]: E1201 19:57:20.722092 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.722746 4802 scope.go:117] "RemoveContainer" containerID="c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.723008 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:20 crc kubenswrapper[4802]: E1201 19:57:20.723083 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.745295 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for 
RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3
d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.762882 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.782013 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.785126 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.785157 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.785168 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:20 crc 
kubenswrapper[4802]: I1201 19:57:20.785184 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.785211 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:20Z","lastTransitionTime":"2025-12-01T19:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.804087 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c
bc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.823179 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.852827 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.869377 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.884527 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.886922 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.886943 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.886952 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.886966 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.886975 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:20Z","lastTransitionTime":"2025-12-01T19:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.899784 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.915545 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.929980 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc 
kubenswrapper[4802]: I1201 19:57:20.950310 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.966088 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:20 crc kubenswrapper[4802]: I1201 19:57:20.989066 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\
"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:20Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.005544 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.005859 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.005880 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.007845 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.007912 4802 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.008732 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.027604 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.056232 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:08Z\\\",\\\"message\\\":\\\"gmap:NhpE8g== operator.openshift.io/dep-openshift-apiserver.trusted-ca-bundle.configmap:ElMHxA==]\\\\nI1201 19:57:08.649562 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr: failed to check if pod 
openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:nonroot-v2 openshift.io/scc:nonroot-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1201 19:57:08.649698 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s: failed to check if pod openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nE1201 19:57:08.712538 6233 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1201 19:57:08.713800 6233 ovnkube.go:599] Stopped ovnkube\\\\nI1201 19:57:08.713874 6233 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b
0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.110957 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.110993 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.111015 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.111033 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.111047 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.119030 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/1.log" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.121930 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.123161 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.154866 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.183762 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:08Z\\\",\\\"message\\\":\\\"gmap:NhpE8g== operator.openshift.io/dep-openshift-apiserver.trusted-ca-bundle.configmap:ElMHxA==]\\\\nI1201 19:57:08.649562 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr: failed to check if pod 
openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:nonroot-v2 openshift.io/scc:nonroot-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1201 19:57:08.649698 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s: failed to check if pod openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nE1201 19:57:08.712538 6233 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1201 19:57:08.713800 6233 ovnkube.go:599] Stopped ovnkube\\\\nI1201 19:57:08.713874 6233 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"
/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.203185 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.213789 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.213828 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.213840 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.213856 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.213868 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.217627 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579
493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.231269 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.242734 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.253162 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.266790 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.276787 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.286720 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.297657 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc 
kubenswrapper[4802]: I1201 19:57:21.307912 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.316147 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.316180 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.316191 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.316217 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.316228 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.323303 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.339250 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.353936 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.369497 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.379552 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.379589 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.379600 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.379616 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.379629 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.381399 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: E1201 19:57:21.393054 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.396810 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.396861 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.396870 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.396883 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.396892 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: E1201 19:57:21.407760 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.411746 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.411774 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.411783 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.411797 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.411807 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: E1201 19:57:21.428253 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.432499 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.432536 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.432549 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.432568 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.432580 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: E1201 19:57:21.445743 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.449472 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.449517 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.449530 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.449547 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.449560 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: E1201 19:57:21.463347 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:21Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:21 crc kubenswrapper[4802]: E1201 19:57:21.463517 4802 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.465415 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.465451 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.465462 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.465477 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.465489 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.569339 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.569411 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.569424 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.569442 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.569474 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.672766 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.673295 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.673354 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.673462 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.673484 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.719722 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:21 crc kubenswrapper[4802]: E1201 19:57:21.719978 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.777056 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.777143 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.777164 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.777231 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.777256 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.880094 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.880166 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.880183 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.880230 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.880244 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.982925 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.983038 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.983061 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.983095 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:21 crc kubenswrapper[4802]: I1201 19:57:21.983120 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:21Z","lastTransitionTime":"2025-12-01T19:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.085651 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.085691 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.085700 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.085716 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.085726 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:22Z","lastTransitionTime":"2025-12-01T19:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.127439 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/2.log" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.128364 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/1.log" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.131618 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde" exitCode=1 Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.131670 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.131726 4802 scope.go:117] "RemoveContainer" containerID="c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.132735 4802 scope.go:117] "RemoveContainer" containerID="a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde" Dec 01 19:57:22 crc kubenswrapper[4802]: E1201 19:57:22.133021 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.146040 4802 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc 
kubenswrapper[4802]: I1201 19:57:22.160597 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.173769 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.186447 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.188132 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.188159 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.188167 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.188181 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.188192 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:22Z","lastTransitionTime":"2025-12-01T19:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.197864 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.209700 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.228526 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.246187 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.260667 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546f
d59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.273877 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.285592 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.290576 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.290637 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.290649 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.290667 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.290680 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:22Z","lastTransitionTime":"2025-12-01T19:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.298958 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:
56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.327509 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5ae9cd123a33e86663401b0a8bffb6835e13283b857a173f3528ed8da869507\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:08Z\\\",\\\"message\\\":\\\"gmap:NhpE8g== operator.openshift.io/dep-openshift-apiserver.trusted-ca-bundle.configmap:ElMHxA==]\\\\nI1201 19:57:08.649562 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr: failed to check if pod 
openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:nonroot-v2 openshift.io/scc:nonroot-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nI1201 19:57:08.649698 6233 controller.go:257] Controller udn-host-isolation-manager: error found while processing openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s: failed to check if pod openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s is in primary UDN: could not find OVN pod annotation in map[openshift.io/required-scc:restricted-v2 openshift.io/scc:restricted-v2 seccomp.security.alpha.kubernetes.io/pod:runtime/default]\\\\nE1201 19:57:08.712538 6233 shared_informer.go:316] \\\\\\\"Unhandled Error\\\\\\\" err=\\\\\\\"unable to sync caches for ovn-lb-controller\\\\\\\" logger=\\\\\\\"UnhandledError\\\\\\\"\\\\nI1201 19:57:08.713800 6233 ovnkube.go:599] Stopped ovnkube\\\\nI1201 19:57:08.713874 6233 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"9:57:21.591577 6620 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1201 19:57:21.591574 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-9wvdw after 0 failed attempt(s)\\\\nI1201 19:57:21.591594 6620 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-9wvdw\\\\nI1201 19:57:21.591563 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 19:57:21.591603 6620 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 19:57:21.591477 6620 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1201 19:57:21.591608 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc
/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.341324 4802 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3
ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.355182 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.370411 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.385272 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:22Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.393041 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.393238 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.393313 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.393413 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.393481 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:22Z","lastTransitionTime":"2025-12-01T19:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.495101 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.495141 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.495150 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.495164 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.495175 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:22Z","lastTransitionTime":"2025-12-01T19:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.597865 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.597918 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.597935 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.597959 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.597975 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:22Z","lastTransitionTime":"2025-12-01T19:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.700344 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.700401 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.700409 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.700423 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.700433 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:22Z","lastTransitionTime":"2025-12-01T19:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.719833 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.719849 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.719952 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:22 crc kubenswrapper[4802]: E1201 19:57:22.720113 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:22 crc kubenswrapper[4802]: E1201 19:57:22.720269 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:22 crc kubenswrapper[4802]: E1201 19:57:22.720421 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.803477 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.803522 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.803531 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.803546 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.803555 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:22Z","lastTransitionTime":"2025-12-01T19:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.906409 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.906450 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.906460 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.906479 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:22 crc kubenswrapper[4802]: I1201 19:57:22.906490 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:22Z","lastTransitionTime":"2025-12-01T19:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.008783 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.008844 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.008857 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.008879 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.008890 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.110903 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.110949 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.110961 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.110977 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.110988 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.135480 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/2.log" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.139376 4802 scope.go:117] "RemoveContainer" containerID="a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde" Dec 01 19:57:23 crc kubenswrapper[4802]: E1201 19:57:23.139566 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.154042 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.175423 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"9:57:21.591577 6620 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1201 19:57:21.591574 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-9wvdw after 0 
failed attempt(s)\\\\nI1201 19:57:21.591594 6620 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-9wvdw\\\\nI1201 19:57:21.591563 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 19:57:21.591603 6620 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 19:57:21.591477 6620 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1201 19:57:21.591608 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b
0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.187042 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.198588 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.213403 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.213439 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.213449 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.213462 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.213471 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.217597 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6
893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.230627 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.245267 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.268379 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.292938 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.313003 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.315566 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.315867 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.315935 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.316007 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.316063 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.328882 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc 
kubenswrapper[4802]: I1201 19:57:23.341292 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.353394 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.362473 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.372306 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.382438 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.392036 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:23Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.418110 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.418150 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.418159 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.418174 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.418183 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.520115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.520154 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.520163 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.520175 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.520185 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.623268 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.623305 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.623315 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.623330 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.623339 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.719424 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:23 crc kubenswrapper[4802]: E1201 19:57:23.720000 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.726351 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.726382 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.726391 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.726403 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.726412 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.828888 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.828957 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.828983 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.829007 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.829023 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.931591 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.931624 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.931634 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.931650 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:23 crc kubenswrapper[4802]: I1201 19:57:23.931661 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:23Z","lastTransitionTime":"2025-12-01T19:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.034109 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.034150 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.034163 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.034179 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.034211 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.137115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.137232 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.137258 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.137282 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.137302 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.239801 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.239864 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.239888 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.239918 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.239946 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.342492 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.342568 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.342595 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.342624 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.342647 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.444816 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.444852 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.444870 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.444894 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.444911 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.548115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.548180 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.548190 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.548408 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.548419 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.651499 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.651541 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.651553 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.651569 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.651579 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.719291 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.719320 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:24 crc kubenswrapper[4802]: E1201 19:57:24.719395 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:24 crc kubenswrapper[4802]: E1201 19:57:24.719448 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.719988 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:24 crc kubenswrapper[4802]: E1201 19:57:24.720125 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.753491 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.753686 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.753757 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.753877 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.753985 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.766048 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:24 crc kubenswrapper[4802]: E1201 19:57:24.766228 4802 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:57:24 crc kubenswrapper[4802]: E1201 19:57:24.766290 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs podName:008be62d-2cef-42a3-912f-2b2e58f8e30b nodeName:}" failed. No retries permitted until 2025-12-01 19:57:56.766274432 +0000 UTC m=+98.328834083 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs") pod "network-metrics-daemon-p8cs7" (UID: "008be62d-2cef-42a3-912f-2b2e58f8e30b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.856079 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.856120 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.856129 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.856145 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.856156 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.958143 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.958187 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.958214 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.958234 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:24 crc kubenswrapper[4802]: I1201 19:57:24.958247 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:24Z","lastTransitionTime":"2025-12-01T19:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.060466 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.060505 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.060515 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.060530 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.060541 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.145559 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/0.log" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.145620 4802 generic.go:334] "Generic (PLEG): container finished" podID="bd82ca15-4489-4c15-aaf0-afb6b6787dc6" containerID="f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3" exitCode=1 Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.145658 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8zl28" event={"ID":"bd82ca15-4489-4c15-aaf0-afb6b6787dc6","Type":"ContainerDied","Data":"f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.146138 4802 scope.go:117] "RemoveContainer" containerID="f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.159429 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.163247 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.163280 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.163291 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.163308 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.163318 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.175456 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.184987 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.195467 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03
231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.208083 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.225271 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.241401 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.253516 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.263758 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.264991 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.265017 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.265026 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.265042 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.265051 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.281877 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:24Z\\\",\\\"message\\\":\\\"2025-12-01T19:56:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4\\\\n2025-12-01T19:56:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4 to /host/opt/cni/bin/\\\\n2025-12-01T19:56:39Z [verbose] multus-daemon started\\\\n2025-12-01T19:56:39Z [verbose] Readiness Indicator file check\\\\n2025-12-01T19:57:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.298279 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"9:57:21.591577 6620 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1201 19:57:21.591574 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-9wvdw after 0 
failed attempt(s)\\\\nI1201 19:57:21.591594 6620 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-9wvdw\\\\nI1201 19:57:21.591563 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 19:57:21.591603 6620 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 19:57:21.591477 6620 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1201 19:57:21.591608 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b
0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.315367 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.327459 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.339846 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.349426 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.358596 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.366974 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.366998 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.367008 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc 
kubenswrapper[4802]: I1201 19:57:25.367020 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.367028 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.379599 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c
bc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:25Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.469421 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.469453 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.469461 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.469473 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.469482 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.571756 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.572050 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.572278 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.572438 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.572530 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.674749 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.674784 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.674792 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.674804 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.674812 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.719579 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:25 crc kubenswrapper[4802]: E1201 19:57:25.719732 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.777392 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.777462 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.777485 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.777513 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.777537 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.880598 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.880644 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.880660 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.880684 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.880701 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.983135 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.983168 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.983178 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.983208 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:25 crc kubenswrapper[4802]: I1201 19:57:25.983220 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:25Z","lastTransitionTime":"2025-12-01T19:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.086280 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.086359 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.086385 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.086414 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.086436 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:26Z","lastTransitionTime":"2025-12-01T19:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.151155 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/0.log" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.151309 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8zl28" event={"ID":"bd82ca15-4489-4c15-aaf0-afb6b6787dc6","Type":"ContainerStarted","Data":"8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.170933 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.185995 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546f
d59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.188830 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.188904 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.188931 4802 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.188961 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.188988 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:26Z","lastTransitionTime":"2025-12-01T19:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.207303 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:24Z\\\",\\\"message\\\":\\\"2025-12-01T19:56:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4\\\\n2025-12-01T19:56:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4 to /host/opt/cni/bin/\\\\n2025-12-01T19:56:39Z [verbose] multus-daemon started\\\\n2025-12-01T19:56:39Z [verbose] Readiness Indicator file check\\\\n2025-12-01T19:57:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.227518 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"9:57:21.591577 6620 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1201 19:57:21.591574 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-9wvdw after 0 
failed attempt(s)\\\\nI1201 19:57:21.591594 6620 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-9wvdw\\\\nI1201 19:57:21.591563 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 19:57:21.591603 6620 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 19:57:21.591477 6620 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1201 19:57:21.591608 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b
0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.239866 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.254466 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.268682 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.286493 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.290782 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.290857 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.290878 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.290906 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.290923 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:26Z","lastTransitionTime":"2025-12-01T19:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.300772 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.318942 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.330768 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.343621 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.355411 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc 
kubenswrapper[4802]: I1201 19:57:26.367466 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.380096 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.393675 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.393728 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.393741 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 
19:57:26.393758 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.394133 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:26Z","lastTransitionTime":"2025-12-01T19:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.396135 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.412781 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:26Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.496077 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.496115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.496127 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.496142 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.496154 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:26Z","lastTransitionTime":"2025-12-01T19:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.598554 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.598600 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.598612 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.598629 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.598639 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:26Z","lastTransitionTime":"2025-12-01T19:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.701262 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.701300 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.701310 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.701322 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.701330 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:26Z","lastTransitionTime":"2025-12-01T19:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.719693 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.719817 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:26 crc kubenswrapper[4802]: E1201 19:57:26.719932 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.720011 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:26 crc kubenswrapper[4802]: E1201 19:57:26.720168 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:26 crc kubenswrapper[4802]: E1201 19:57:26.720306 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.807318 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.807399 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.807416 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.807439 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.807458 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:26Z","lastTransitionTime":"2025-12-01T19:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.910447 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.910486 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.910497 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.910511 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:26 crc kubenswrapper[4802]: I1201 19:57:26.910520 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:26Z","lastTransitionTime":"2025-12-01T19:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.012876 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.012927 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.012942 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.012962 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.012975 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.115715 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.115770 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.115779 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.115793 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.115803 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.217731 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.217777 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.217789 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.217809 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.217822 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.319966 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.320009 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.320020 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.320036 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.320044 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.421994 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.422049 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.422061 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.422077 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.422089 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.524028 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.524063 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.524076 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.524091 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.524102 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.627122 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.627160 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.627174 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.627191 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.627227 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.719670 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:27 crc kubenswrapper[4802]: E1201 19:57:27.719803 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.729364 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.729425 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.729441 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.729469 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.729521 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.834374 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.834447 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.834459 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.834485 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.834499 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.938430 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.938488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.938500 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.938520 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:27 crc kubenswrapper[4802]: I1201 19:57:27.938532 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:27Z","lastTransitionTime":"2025-12-01T19:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.041089 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.041138 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.041149 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.041168 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.041181 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.144549 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.144596 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.144606 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.144623 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.144634 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.248127 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.248192 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.248252 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.248282 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.248303 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.351085 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.351658 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.351812 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.351960 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.352129 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.455389 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.455456 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.455474 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.455500 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.455519 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.558622 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.558920 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.558986 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.559054 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.559115 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.662395 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.662439 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.662450 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.662467 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.662481 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.719135 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.719145 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:28 crc kubenswrapper[4802]: E1201 19:57:28.719398 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.719453 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:28 crc kubenswrapper[4802]: E1201 19:57:28.719606 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:28 crc kubenswrapper[4802]: E1201 19:57:28.719725 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.736066 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.748322 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.765444 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.765500 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.765510 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.765527 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.765553 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.768419 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f
79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.783364 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.803557 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:24Z\\\",\\\"message\\\":\\\"2025-12-01T19:56:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4\\\\n2025-12-01T19:56:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4 to /host/opt/cni/bin/\\\\n2025-12-01T19:56:39Z [verbose] multus-daemon started\\\\n2025-12-01T19:56:39Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T19:57:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.840471 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"9:57:21.591577 6620 base_network_controller_pods.go:477] 
[default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1201 19:57:21.591574 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-9wvdw after 0 failed attempt(s)\\\\nI1201 19:57:21.591594 6620 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-9wvdw\\\\nI1201 19:57:21.591563 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 19:57:21.591603 6620 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 19:57:21.591477 6620 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1201 19:57:21.591608 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b
0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.864268 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.872092 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.872150 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.872169 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.872227 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.872259 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.885677 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.901264 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.915295 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.929046 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc 
kubenswrapper[4802]: I1201 19:57:28.946741 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.966450 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.974294 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.974335 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.974350 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 
19:57:28.974370 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.974383 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:28Z","lastTransitionTime":"2025-12-01T19:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:28 crc kubenswrapper[4802]: I1201 19:57:28.982781 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.002256 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:28Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.017911 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:29Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.030102 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e6
3278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:29Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.077239 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.077533 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.077621 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.077710 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.077811 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:29Z","lastTransitionTime":"2025-12-01T19:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.181422 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.181489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.181506 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.181536 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.181557 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:29Z","lastTransitionTime":"2025-12-01T19:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.284638 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.284715 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.284736 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.284767 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.284790 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:29Z","lastTransitionTime":"2025-12-01T19:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.387211 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.387500 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.387703 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.387840 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.387931 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:29Z","lastTransitionTime":"2025-12-01T19:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.490125 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.490166 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.490179 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.490211 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.490221 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:29Z","lastTransitionTime":"2025-12-01T19:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.593446 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.593495 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.593507 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.593524 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.593537 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:29Z","lastTransitionTime":"2025-12-01T19:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.696179 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.696294 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.696311 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.696335 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.696354 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:29Z","lastTransitionTime":"2025-12-01T19:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.719034 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:29 crc kubenswrapper[4802]: E1201 19:57:29.719190 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.798637 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.800079 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.800280 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.800454 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.800603 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:29Z","lastTransitionTime":"2025-12-01T19:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.903180 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.903247 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.903260 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.903280 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:29 crc kubenswrapper[4802]: I1201 19:57:29.903296 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:29Z","lastTransitionTime":"2025-12-01T19:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.005837 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.005880 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.005892 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.005908 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.005919 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.108611 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.108658 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.108676 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.108716 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.108730 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.211370 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.211411 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.211420 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.211437 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.211446 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.314305 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.314391 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.314406 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.314423 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.314436 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.416450 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.416488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.416501 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.416517 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.416528 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.519045 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.519076 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.519084 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.519096 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.519105 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.623295 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.623353 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.623366 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.623386 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.623406 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.719806 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.719819 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:30 crc kubenswrapper[4802]: E1201 19:57:30.719947 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:30 crc kubenswrapper[4802]: E1201 19:57:30.719999 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.719839 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:30 crc kubenswrapper[4802]: E1201 19:57:30.720077 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.725777 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.725812 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.725821 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.725832 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.725843 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.828184 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.828259 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.828272 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.828312 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.828325 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.930627 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.930701 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.930715 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.930732 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:30 crc kubenswrapper[4802]: I1201 19:57:30.930745 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:30Z","lastTransitionTime":"2025-12-01T19:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.033018 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.033068 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.033080 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.033099 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.033111 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.135305 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.135357 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.135369 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.135385 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.135395 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.236972 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.237020 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.237032 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.237046 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.237056 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.339054 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.339079 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.339087 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.339100 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.339109 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.441875 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.441914 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.441922 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.441937 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.441945 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.544153 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.544227 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.544243 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.544261 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.544272 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.646144 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.646218 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.646227 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.646245 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.646255 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.719838 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:31 crc kubenswrapper[4802]: E1201 19:57:31.719979 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.748403 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.748484 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.748499 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.748518 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.748531 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.813504 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.813537 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.813548 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.813561 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.813570 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: E1201 19:57:31.826146 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:31Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.829448 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.829479 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.829490 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.829505 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.829516 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: E1201 19:57:31.841404 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:31Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.844698 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.844719 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.844729 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.844760 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.844770 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: E1201 19:57:31.854877 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:31Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.857814 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.857856 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.857888 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.857908 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.857920 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: E1201 19:57:31.871391 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:31Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.874166 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.874224 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.874232 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.874244 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.874251 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: E1201 19:57:31.884657 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:31Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:31 crc kubenswrapper[4802]: E1201 19:57:31.884849 4802 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.886399 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.886425 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.886433 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.886444 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.886452 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.989434 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.989493 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.989511 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.989533 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:31 crc kubenswrapper[4802]: I1201 19:57:31.989549 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:31Z","lastTransitionTime":"2025-12-01T19:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.092292 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.092342 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.092354 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.092373 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.092388 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:32Z","lastTransitionTime":"2025-12-01T19:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.195522 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.195561 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.195571 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.195589 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.195598 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:32Z","lastTransitionTime":"2025-12-01T19:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.298113 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.298155 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.298167 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.298186 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.298223 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:32Z","lastTransitionTime":"2025-12-01T19:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.400749 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.400794 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.400809 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.400824 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.400833 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:32Z","lastTransitionTime":"2025-12-01T19:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.502954 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.502996 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.503007 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.503025 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.503037 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:32Z","lastTransitionTime":"2025-12-01T19:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.605678 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.605709 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.605717 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.605732 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.605741 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:32Z","lastTransitionTime":"2025-12-01T19:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.707572 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.707629 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.707642 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.707659 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.707671 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:32Z","lastTransitionTime":"2025-12-01T19:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.719260 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.719340 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:32 crc kubenswrapper[4802]: E1201 19:57:32.719362 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.719260 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:32 crc kubenswrapper[4802]: E1201 19:57:32.719470 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:32 crc kubenswrapper[4802]: E1201 19:57:32.719555 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.810651 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.810691 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.810702 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.810717 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.810727 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:32Z","lastTransitionTime":"2025-12-01T19:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.913774 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.913825 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.913843 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.913865 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:32 crc kubenswrapper[4802]: I1201 19:57:32.913882 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:32Z","lastTransitionTime":"2025-12-01T19:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.016871 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.016909 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.016920 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.016938 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.016949 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.119421 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.119462 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.119471 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.119486 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.119496 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.221226 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.221279 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.221298 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.221326 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.221345 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.323893 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.323924 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.323931 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.323946 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.323978 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.427633 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.427687 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.427697 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.427710 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.427718 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.530279 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.530349 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.530359 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.530393 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.530403 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.633023 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.633060 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.633074 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.633090 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.633102 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.719711 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:33 crc kubenswrapper[4802]: E1201 19:57:33.719863 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.735373 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.735427 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.735444 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.735466 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.735484 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.837902 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.837940 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.837951 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.837967 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.837978 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.940427 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.940476 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.940489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.940507 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:33 crc kubenswrapper[4802]: I1201 19:57:33.940519 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:33Z","lastTransitionTime":"2025-12-01T19:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.042978 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.043033 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.043050 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.043071 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.043087 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.146493 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.146538 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.146554 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.146573 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.146587 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.248663 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.248697 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.248706 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.248719 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.248730 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.351164 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.351221 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.351232 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.351247 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.351257 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.453719 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.453776 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.453800 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.453823 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.453837 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.556152 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.556189 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.556211 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.556225 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.556235 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.658717 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.658773 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.658789 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.658816 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.658831 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.719683 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.719683 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.719954 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:34 crc kubenswrapper[4802]: E1201 19:57:34.719840 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:34 crc kubenswrapper[4802]: E1201 19:57:34.720154 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:34 crc kubenswrapper[4802]: E1201 19:57:34.720345 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.764968 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.765068 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.765089 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.765115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.765132 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.868338 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.868393 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.868411 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.868434 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.868450 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.971489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.971525 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.971536 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.971550 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:34 crc kubenswrapper[4802]: I1201 19:57:34.971561 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:34Z","lastTransitionTime":"2025-12-01T19:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.074048 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.074100 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.074116 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.074138 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.074156 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:35Z","lastTransitionTime":"2025-12-01T19:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.177080 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.177145 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.177174 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.177235 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.177258 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:35Z","lastTransitionTime":"2025-12-01T19:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.279688 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.279737 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.279754 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.279777 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.279794 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:35Z","lastTransitionTime":"2025-12-01T19:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.383377 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.383430 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.383449 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.383473 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.383491 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:35Z","lastTransitionTime":"2025-12-01T19:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.487179 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.487293 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.487354 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.487380 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.487397 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:35Z","lastTransitionTime":"2025-12-01T19:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.589636 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.589699 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.589716 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.589743 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.589761 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:35Z","lastTransitionTime":"2025-12-01T19:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.693149 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.693231 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.693251 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.693274 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.693290 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:35Z","lastTransitionTime":"2025-12-01T19:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.719900 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:35 crc kubenswrapper[4802]: E1201 19:57:35.720084 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.796562 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.796677 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.796703 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.796788 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.796855 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:35Z","lastTransitionTime":"2025-12-01T19:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.900782 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.900841 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.900860 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.900884 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:35 crc kubenswrapper[4802]: I1201 19:57:35.900901 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:35Z","lastTransitionTime":"2025-12-01T19:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.004398 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.004480 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.004509 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.004540 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.004564 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.106804 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.106874 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.106886 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.106928 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.106944 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.210364 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.210462 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.210481 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.210538 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.211063 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.314814 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.314975 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.315000 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.315060 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.315082 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.418935 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.419020 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.419050 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.419089 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.419117 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.521398 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.521464 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.521481 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.521506 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.521524 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.624245 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.624290 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.624301 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.624316 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.624329 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.719979 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.720072 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:36 crc kubenswrapper[4802]: E1201 19:57:36.720150 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.720074 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:36 crc kubenswrapper[4802]: E1201 19:57:36.720315 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:36 crc kubenswrapper[4802]: E1201 19:57:36.720444 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.726608 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.726645 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.726686 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.726703 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.726714 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.829112 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.829161 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.829177 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.829229 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.829247 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.932485 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.932531 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.932547 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.932570 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:36 crc kubenswrapper[4802]: I1201 19:57:36.932586 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:36Z","lastTransitionTime":"2025-12-01T19:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.036182 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.036266 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.036284 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.036309 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.036327 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.139668 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.139724 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.139741 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.139771 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.139788 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.243019 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.243079 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.243099 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.243126 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.243144 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.346267 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.346328 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.346346 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.346370 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.346388 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.448910 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.448986 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.449009 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.449038 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.449059 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.551665 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.552030 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.552051 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.552077 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.552094 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.655110 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.655151 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.655163 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.655177 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.655187 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.719523 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:37 crc kubenswrapper[4802]: E1201 19:57:37.719744 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.722724 4802 scope.go:117] "RemoveContainer" containerID="a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde" Dec 01 19:57:37 crc kubenswrapper[4802]: E1201 19:57:37.722996 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.758791 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.758840 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.758852 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.758868 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.758881 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.861388 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.861464 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.861488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.861518 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.861542 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.964891 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.965058 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.965086 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.965118 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:37 crc kubenswrapper[4802]: I1201 19:57:37.965143 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:37Z","lastTransitionTime":"2025-12-01T19:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.068054 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.068134 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.068164 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.068228 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.068248 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.170674 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.170716 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.170730 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.170748 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.170759 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.273081 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.273144 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.273158 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.273177 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.273190 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.375886 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.375943 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.375961 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.375984 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.376002 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.478573 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.478638 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.478658 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.478686 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.478704 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.581869 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.581911 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.581924 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.581939 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.581951 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.683520 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.683576 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.683586 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.683598 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.683606 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.719305 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.719372 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:38 crc kubenswrapper[4802]: E1201 19:57:38.719415 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.719392 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:38 crc kubenswrapper[4802]: E1201 19:57:38.719714 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:38 crc kubenswrapper[4802]: E1201 19:57:38.719645 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.751844 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"9:57:21.591577 6620 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1201 19:57:21.591574 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-9wvdw after 0 
failed attempt(s)\\\\nI1201 19:57:21.591594 6620 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-9wvdw\\\\nI1201 19:57:21.591563 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 19:57:21.591603 6620 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 19:57:21.591477 6620 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1201 19:57:21.591608 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b
0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.771870 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.786965 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.787043 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.787068 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.787097 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.787120 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.787682 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579
493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.803897 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:24Z\\\",\\\"message\\\":\\\"2025-12-01T19:56:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4\\\\n2025-12-01T19:56:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4 to /host/opt/cni/bin/\\\\n2025-12-01T19:56:39Z [verbose] multus-daemon started\\\\n2025-12-01T19:56:39Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T19:57:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.832013 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.851169 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.865725 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.886030 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.892332 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.892361 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.892371 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.892391 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.892401 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.904805 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.920633 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.937350 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03
231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.950147 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.964353 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.978768 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.989691 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:38Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.994918 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.994964 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.994976 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.994994 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:38 crc kubenswrapper[4802]: I1201 19:57:38.995007 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:38Z","lastTransitionTime":"2025-12-01T19:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.002492 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.017130 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:39Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.098355 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.098413 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.098430 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.098453 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.098472 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:39Z","lastTransitionTime":"2025-12-01T19:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.200688 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.200732 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.200744 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.200760 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.200770 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:39Z","lastTransitionTime":"2025-12-01T19:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.302950 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.302985 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.303013 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.303027 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.303036 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:39Z","lastTransitionTime":"2025-12-01T19:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.406297 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.406353 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.406370 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.406397 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.406415 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:39Z","lastTransitionTime":"2025-12-01T19:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.509348 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.509398 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.509415 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.509440 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.509458 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:39Z","lastTransitionTime":"2025-12-01T19:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.612472 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.612532 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.612549 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.612577 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.612595 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:39Z","lastTransitionTime":"2025-12-01T19:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.715115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.715167 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.715177 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.715191 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.715244 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:39Z","lastTransitionTime":"2025-12-01T19:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.719152 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:39 crc kubenswrapper[4802]: E1201 19:57:39.719361 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.818101 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.818132 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.818141 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.818155 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.818163 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:39Z","lastTransitionTime":"2025-12-01T19:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.920138 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.920249 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.920268 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.920290 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:39 crc kubenswrapper[4802]: I1201 19:57:39.920309 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:39Z","lastTransitionTime":"2025-12-01T19:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.022977 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.023021 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.023032 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.023050 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.023062 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.124601 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.124629 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.124638 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.124651 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.124660 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.227534 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.227608 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.227633 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.227665 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.227695 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.330957 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.331026 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.331048 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.331079 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.331100 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.434048 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.434136 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.434173 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.434259 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.434300 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.524886 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.525033 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.525107 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:44.525081832 +0000 UTC m=+146.087641493 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.525155 4802 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.525292 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:58:44.525263978 +0000 UTC m=+146.087823659 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.525154 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.525487 4802 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.525546 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 19:58:44.525528626 +0000 UTC m=+146.088088287 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.536928 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.536964 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.536975 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.536991 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.537002 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.626771 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.626859 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.627021 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.627047 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.627067 4802 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.627141 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 19:58:44.627120092 +0000 UTC m=+146.189679773 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.627182 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.627268 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.627291 4802 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.627387 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 19:58:44.62735975 +0000 UTC m=+146.189919431 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.639703 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.639773 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.639796 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.639829 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.639851 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.719745 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.719887 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.719999 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.720032 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.720273 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:40 crc kubenswrapper[4802]: E1201 19:57:40.720498 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.742581 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.742667 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.742683 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.742697 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.742709 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.844881 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.844935 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.844951 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.844975 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.844993 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.947402 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.947470 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.947488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.947513 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:40 crc kubenswrapper[4802]: I1201 19:57:40.947530 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:40Z","lastTransitionTime":"2025-12-01T19:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.050231 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.050295 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.050321 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.050349 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.050365 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.153020 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.153069 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.153085 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.153109 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.153125 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.256283 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.256362 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.256385 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.256416 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.256438 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.359588 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.359657 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.359668 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.359685 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.359697 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.462985 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.463395 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.463434 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.463465 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.463488 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.566802 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.567146 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.567314 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.567456 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.567583 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.670101 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.670150 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.670162 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.670180 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.670210 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.719910 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:41 crc kubenswrapper[4802]: E1201 19:57:41.720037 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.772739 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.772777 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.772785 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.772799 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.772808 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.874947 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.874977 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.874985 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.874998 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.875009 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.977779 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.977817 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.977825 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.977838 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:41 crc kubenswrapper[4802]: I1201 19:57:41.977848 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:41Z","lastTransitionTime":"2025-12-01T19:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.081399 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.081462 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.081482 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.081507 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.081524 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.184868 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.184942 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.184977 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.185006 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.185025 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.199464 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.199506 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.199517 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.199535 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.199547 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: E1201 19:57:42.219718 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.225340 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.225396 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.225446 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.225472 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.225489 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: E1201 19:57:42.244924 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.250080 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.250160 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.250189 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.250264 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.250290 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: E1201 19:57:42.273421 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...status patch identical to previous attempt...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.279592 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.279657 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.279676 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.279701 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.279718 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: E1201 19:57:42.299312 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.304338 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.304397 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.304419 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.304452 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.304475 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: E1201 19:57:42.322050 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:42Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:42 crc kubenswrapper[4802]: E1201 19:57:42.322340 4802 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.324413 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.324455 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.324467 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.324485 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.324498 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.427013 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.427062 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.427072 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.427087 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.427098 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.529393 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.529462 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.529479 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.529504 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.529521 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.632066 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.632111 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.632120 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.632134 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.632145 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.719634 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:42 crc kubenswrapper[4802]: E1201 19:57:42.719767 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.719826 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.719872 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:42 crc kubenswrapper[4802]: E1201 19:57:42.720023 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:42 crc kubenswrapper[4802]: E1201 19:57:42.720141 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.734245 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.734277 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.734287 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.734301 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.734312 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.836240 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.836315 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.836338 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.836368 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.836389 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.939485 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.939539 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.939554 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.939573 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:42 crc kubenswrapper[4802]: I1201 19:57:42.939585 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:42Z","lastTransitionTime":"2025-12-01T19:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.042383 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.042457 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.042480 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.042511 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.042532 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.145973 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.146039 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.146063 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.146093 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.146113 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.249024 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.249084 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.249104 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.249137 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.249154 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.351379 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.351444 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.351463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.351489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.351507 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.454187 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.454298 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.454322 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.454353 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.454374 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.557261 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.557318 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.557335 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.557360 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.557379 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.660442 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.660541 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.660574 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.660611 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.660635 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.719839 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:43 crc kubenswrapper[4802]: E1201 19:57:43.720000 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.762927 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.762967 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.762981 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.762995 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.763004 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.865825 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.865878 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.865893 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.865915 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.865932 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.969011 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.969067 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.969090 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.969120 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:43 crc kubenswrapper[4802]: I1201 19:57:43.969144 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:43Z","lastTransitionTime":"2025-12-01T19:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.072929 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.073000 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.073018 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.073044 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.073062 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:44Z","lastTransitionTime":"2025-12-01T19:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.176541 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.176618 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.176642 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.176673 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.176699 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:44Z","lastTransitionTime":"2025-12-01T19:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.279370 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.279431 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.279447 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.279469 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.279485 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:44Z","lastTransitionTime":"2025-12-01T19:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.382805 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.382867 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.382880 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.382899 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.382915 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:44Z","lastTransitionTime":"2025-12-01T19:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.485632 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.485689 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.485704 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.485728 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.485745 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:44Z","lastTransitionTime":"2025-12-01T19:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.588401 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.588461 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.588481 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.588505 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.588522 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:44Z","lastTransitionTime":"2025-12-01T19:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.692123 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.692239 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.692264 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.692295 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.692319 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:44Z","lastTransitionTime":"2025-12-01T19:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.719392 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.719522 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:44 crc kubenswrapper[4802]: E1201 19:57:44.719615 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.719670 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:44 crc kubenswrapper[4802]: E1201 19:57:44.719753 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:44 crc kubenswrapper[4802]: E1201 19:57:44.719908 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.795585 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.795629 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.795638 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.795652 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.795663 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:44Z","lastTransitionTime":"2025-12-01T19:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.897856 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.897944 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.897961 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.897979 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:44 crc kubenswrapper[4802]: I1201 19:57:44.897989 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:44Z","lastTransitionTime":"2025-12-01T19:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.001137 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.001244 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.001264 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.001288 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.001304 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.103762 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.103810 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.103826 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.103845 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.103859 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.206296 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.206372 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.206443 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.206477 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.206499 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.309444 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.309481 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.309492 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.309507 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.309519 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.411872 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.411908 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.411917 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.411930 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.411941 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.514625 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.514743 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.514816 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.514847 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.514919 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.617668 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.617708 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.617719 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.617736 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.617754 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.719081 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:45 crc kubenswrapper[4802]: E1201 19:57:45.719548 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.720834 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.720918 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.720939 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.720962 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.720979 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.824053 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.824110 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.824128 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.824151 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.824168 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.926623 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.926655 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.926665 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.926681 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:45 crc kubenswrapper[4802]: I1201 19:57:45.926691 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:45Z","lastTransitionTime":"2025-12-01T19:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.029127 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.029176 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.029188 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.029228 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.029240 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.131859 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.131896 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.131907 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.131922 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.131932 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.234742 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.234797 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.234815 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.234840 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.234857 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.338279 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.338335 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.338353 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.338382 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.338407 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.440600 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.440645 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.440662 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.440684 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.440703 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.543339 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.543379 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.543388 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.543402 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.543411 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.646276 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.646329 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.646342 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.646359 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.646410 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.719084 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.719184 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:46 crc kubenswrapper[4802]: E1201 19:57:46.719272 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.719331 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:46 crc kubenswrapper[4802]: E1201 19:57:46.719874 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:46 crc kubenswrapper[4802]: E1201 19:57:46.719982 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.749527 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.749603 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.749629 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.749659 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.749682 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.852482 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.852540 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.852558 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.852582 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.852598 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.954650 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.954683 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.954692 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.954706 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:46 crc kubenswrapper[4802]: I1201 19:57:46.954714 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:46Z","lastTransitionTime":"2025-12-01T19:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.057998 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.058069 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.058086 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.058159 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.058237 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.160968 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.161036 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.161052 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.161077 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.161094 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.264312 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.264405 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.264423 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.264448 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.264465 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.367695 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.367766 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.367785 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.367809 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.367829 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.470618 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.470662 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.470698 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.470715 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.470725 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.573852 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.573913 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.573932 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.573961 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.573980 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.677722 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.677818 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.677845 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.677875 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.677897 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.719689 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:47 crc kubenswrapper[4802]: E1201 19:57:47.719878 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.780656 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.780720 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.780737 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.780762 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.780780 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.884489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.884607 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.884660 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.884695 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.884720 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.987931 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.988002 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.988020 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.988045 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:47 crc kubenswrapper[4802]: I1201 19:57:47.988063 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:47Z","lastTransitionTime":"2025-12-01T19:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.090471 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.090533 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.090558 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.090589 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.090612 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:48Z","lastTransitionTime":"2025-12-01T19:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.193340 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.193400 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.193418 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.193440 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.193457 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:48Z","lastTransitionTime":"2025-12-01T19:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.296133 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.296186 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.296218 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.296237 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.296250 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:48Z","lastTransitionTime":"2025-12-01T19:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.399807 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.399867 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.399885 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.399911 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.399928 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:48Z","lastTransitionTime":"2025-12-01T19:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.502632 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.502699 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.502709 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.502741 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.502752 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:48Z","lastTransitionTime":"2025-12-01T19:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.605868 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.605966 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.605988 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.606023 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.606042 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:48Z","lastTransitionTime":"2025-12-01T19:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.710080 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.710155 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.710173 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.710242 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.710262 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:48Z","lastTransitionTime":"2025-12-01T19:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.719548 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.719665 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.719703 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:48 crc kubenswrapper[4802]: E1201 19:57:48.719793 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:48 crc kubenswrapper[4802]: E1201 19:57:48.719930 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:48 crc kubenswrapper[4802]: E1201 19:57:48.720081 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.742467 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.761602 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.781827 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce
0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.800882 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.812469 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.812520 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.812532 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.812545 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.812557 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:48Z","lastTransitionTime":"2025-12-01T19:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.818413 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02e
146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:24Z\\\",\\\"message\\\":\\\"2025-12-01T19:56:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4\\\\n2025-12-01T19:56:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4 to /host/opt/cni/bin/\\\\n2025-12-01T19:56:39Z [verbose] multus-daemon started\\\\n2025-12-01T19:56:39Z [verbose] Readiness Indicator file check\\\\n2025-12-01T19:57:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.841843 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"9:57:21.591577 6620 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1201 19:57:21.591574 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-9wvdw after 0 
failed attempt(s)\\\\nI1201 19:57:21.591594 6620 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-9wvdw\\\\nI1201 19:57:21.591563 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 19:57:21.591603 6620 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 19:57:21.591477 6620 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1201 19:57:21.591608 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b
0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.860092 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.876746 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.890346 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.907493 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.916076 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.916121 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.916140 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.916167 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.916185 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:48Z","lastTransitionTime":"2025-12-01T19:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.922629 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e
8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.936148 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.957477 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.972594 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:48 crc kubenswrapper[4802]: I1201 19:57:48.986456 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.000216 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e6
3278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:48Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.014908 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:49Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:49 crc 
kubenswrapper[4802]: I1201 19:57:49.018829 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.018895 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.018910 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.018930 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.018941 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.122917 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.123275 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.123463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.123616 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.123757 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.227699 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.227771 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.227786 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.227811 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.227826 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.330275 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.330315 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.330324 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.330342 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.330355 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.433020 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.433081 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.433099 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.433122 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.433139 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.537001 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.537093 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.537115 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.537147 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.537171 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.640311 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.640404 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.640423 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.640446 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.640467 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.720132 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:49 crc kubenswrapper[4802]: E1201 19:57:49.720628 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.737508 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.743846 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.743903 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.743916 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.743934 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.743947 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.846516 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.846560 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.846574 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.846592 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.846605 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.949653 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.949696 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.949705 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.949722 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:49 crc kubenswrapper[4802]: I1201 19:57:49.949731 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:49Z","lastTransitionTime":"2025-12-01T19:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.053617 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.053675 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.053689 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.053711 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.053726 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:50Z","lastTransitionTime":"2025-12-01T19:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.156487 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.156546 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.156564 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.156588 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.156605 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:50Z","lastTransitionTime":"2025-12-01T19:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.259340 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.259423 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.259448 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.259481 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.259506 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:50Z","lastTransitionTime":"2025-12-01T19:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.362699 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.362749 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.362761 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.362779 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.362789 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:50Z","lastTransitionTime":"2025-12-01T19:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.465566 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.465635 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.465658 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.465687 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.465716 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:50Z","lastTransitionTime":"2025-12-01T19:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.568488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.568577 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.568600 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.568634 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.568659 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:50Z","lastTransitionTime":"2025-12-01T19:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.672734 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.672816 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.672845 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.672880 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.672907 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:50Z","lastTransitionTime":"2025-12-01T19:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.721851 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:50 crc kubenswrapper[4802]: E1201 19:57:50.721985 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.722888 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:50 crc kubenswrapper[4802]: E1201 19:57:50.723006 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.723084 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:50 crc kubenswrapper[4802]: E1201 19:57:50.723193 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:50 crc kubenswrapper[4802]: I1201 19:57:50.735173 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.211456 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.211547 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.211573 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.211611 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.211640 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:51Z","lastTransitionTime":"2025-12-01T19:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.314651 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.314712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.314728 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.314755 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.314773 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:51Z","lastTransitionTime":"2025-12-01T19:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.417886 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.417930 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.417938 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.417956 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.417971 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:51Z","lastTransitionTime":"2025-12-01T19:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.520650 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.520723 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.520740 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.520765 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.520782 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:51Z","lastTransitionTime":"2025-12-01T19:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.624564 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.624643 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.624669 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.624700 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.624724 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:51Z","lastTransitionTime":"2025-12-01T19:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.719254 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:51 crc kubenswrapper[4802]: E1201 19:57:51.719407 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.720032 4802 scope.go:117] "RemoveContainer" containerID="a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.727094 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.727225 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.727298 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.727343 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.727423 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:51Z","lastTransitionTime":"2025-12-01T19:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.830038 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.830098 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.830117 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.830142 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.830156 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:51Z","lastTransitionTime":"2025-12-01T19:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.933891 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.933947 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.933965 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.933988 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:51 crc kubenswrapper[4802]: I1201 19:57:51.934004 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:51Z","lastTransitionTime":"2025-12-01T19:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.036528 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.036589 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.036607 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.036631 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.036650 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.139521 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.139568 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.139584 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.139607 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.139624 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.236860 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/2.log" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.239755 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.240647 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.241186 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.241239 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.241267 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.241280 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.241288 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.259771 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e
8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.281159 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.296120 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.308522 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.317880 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.329159 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e6
3278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.341549 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc 
kubenswrapper[4802]: I1201 19:57:52.343330 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.343384 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.343395 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.343409 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.343418 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.357267 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.369785 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.380162 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c230905-38b8-447b-96df-05a7c6ea653f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4442bba04936832a31754ab2a26103c31da700a120cd81b45dcf53c004e9b46a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPa
th\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.398079 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c281dc-3264-4ba0-bc35-66fa685b5f57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef2d82c0514ca36725246db0e4a9a5c5015f34a0c280689d0d3e0cbcde56b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bfbe6b9c9e36b730ca3b4a6a42c15028518d500a9e0743c0d2dd8626e06f4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c49e47677142d724eb56b940158bc7ff30886cb98911431ef3a3ecd63969ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc45f69bad64d259b95e47b43b39bbcd70edd29f4ef79a96f44b8c65df41c124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d445731c7cee7ec09d220df6418d3b6752b956d188c3d853ac470dfa11037747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4a806b9002a57bdaaea82b14656b90dc50b57f1e6246fc5d46d6977ca2a6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4a806b9002a57bdaaea82b14656b90dc50b57f1e6246fc5d46d6977ca2a6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.410366 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:
20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.422407 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.434888 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:24Z\\\",\\\"message\\\":\\\"2025-12-01T19:56:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4\\\\n2025-12-01T19:56:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4 to /host/opt/cni/bin/\\\\n2025-12-01T19:56:39Z [verbose] multus-daemon started\\\\n2025-12-01T19:56:39Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T19:57:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.446015 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.446072 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.446087 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.446105 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.446409 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.451735 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"9:57:21.591577 6620 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1201 19:57:21.591574 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-9wvdw after 0 
failed attempt(s)\\\\nI1201 19:57:21.591594 6620 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-9wvdw\\\\nI1201 19:57:21.591563 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 19:57:21.591603 6620 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 19:57:21.591477 6620 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1201 19:57:21.591608 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.463126 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.475326 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.487766 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.503466 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.549307 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.549343 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.549352 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.549367 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.549376 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.557419 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.557464 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.557473 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.557485 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.557492 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: E1201 19:57:52.569266 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.572433 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.572469 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.572480 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.572494 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.572503 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.604178 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.604228 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.604237 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.604251 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.604261 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: E1201 19:57:52.617394 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.620976 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.621014 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.621022 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.621036 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.621046 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: E1201 19:57:52.632973 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ccabb53f-70cd-48e6-8bc8-8247c89db90c\\\",\\\"systemUUID\\\":\\\"7ca05b31-e838-4494-a138-5a5047e18b0e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:52Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:52 crc kubenswrapper[4802]: E1201 19:57:52.633073 4802 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.651669 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.651695 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.651703 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.651715 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.651724 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.719246 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.719316 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:52 crc kubenswrapper[4802]: E1201 19:57:52.719370 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:52 crc kubenswrapper[4802]: E1201 19:57:52.719431 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.719477 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:52 crc kubenswrapper[4802]: E1201 19:57:52.719530 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.753532 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.753578 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.753592 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.753610 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.753623 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.855863 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.855909 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.855921 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.855939 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.855951 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.957939 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.957987 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.957999 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.958016 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:52 crc kubenswrapper[4802]: I1201 19:57:52.958030 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:52Z","lastTransitionTime":"2025-12-01T19:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.061139 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.061174 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.061183 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.061217 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.061228 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.163980 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.164045 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.164061 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.164086 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.164103 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.245495 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/3.log" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.246457 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/2.log" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.250556 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" exitCode=1 Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.250626 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.250694 4802 scope.go:117] "RemoveContainer" containerID="a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.251945 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 19:57:53 crc kubenswrapper[4802]: E1201 19:57:53.252296 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.266515 4802 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.266573 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.266590 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.266614 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.266631 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.269034 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:24Z\\\",\\\"message\\\":\\\"2025-12-01T19:56:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4\\\\n2025-12-01T19:56:39+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4 to /host/opt/cni/bin/\\\\n2025-12-01T19:56:39Z [verbose] multus-daemon started\\\\n2025-12-01T19:56:39Z [verbose] Readiness Indicator file check\\\\n2025-12-01T19:57:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.298163 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6d2f756b7f258fe70e29cf53852ae3603df9a652b028ad38830a0fd73edabde\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:21Z\\\",\\\"message\\\":\\\"9:57:21.591577 6620 base_network_controller_pods.go:477] 
[default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1201 19:57:21.591574 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-9wvdw after 0 failed attempt(s)\\\\nI1201 19:57:21.591594 6620 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-9wvdw\\\\nI1201 19:57:21.591563 6620 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1201 19:57:21.591603 6620 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1201 19:57:21.591477 6620 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1201 19:57:21.591608 6620 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"de-identity-vrzqb openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-image-registry/node-ca-gp8pz 
openshift-machine-config-operator/machine-config-daemon-tw4xd openshift-machine-config-operator/kube-rbac-proxy-crio-crc]\\\\nI1201 19:57:52.620550 6973 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1201 19:57:52.620568 6973 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI1201 19:57:52.620577 6973 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI1201 19:57:52.620585 6973 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI1201 19:57:52.620590 6973 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI1201 19:57:52.620595 6973 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nF1201 19:57:52.620594 6973 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d50
25a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.312760 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c230905-38b8-447b-96df-05a7c6ea653f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4442bba04936832a31754ab2a26103c31da700a120cd81b45dcf53c004e9b46a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.340678 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c281dc-3264-4ba0-bc35-66fa685b5f57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef2d82c0514ca36725246db0e4a9a5c5015f34a0c280689d0d3e0cbcde56b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bfbe6b9c9e36b730ca3b4a6a42c15028518d500a9e0743c0d2dd8626e06f4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c49e47677142d724eb56b940158bc7ff30886cb98911431ef3a3ecd63969ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc45f69bad64d259b95e47b43b39bbcd70edd29f4ef79a96f44b8c65df41c124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d445731c7cee7ec09d220df6418d3b6752b956d188c3d853ac470dfa11037747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4a806b9002a57bdaaea82b14656b90dc50b57f1e6246fc5d46d6977ca2a6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4a806b9002a57bdaaea82b14656b90dc50b57f1e6246fc5d46d6977ca2a6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.355570 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:
20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.369447 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.369484 
4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.369495 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.369511 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.369522 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.372087 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.389485 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.405528 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.417446 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae24
59320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.436211 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f2
7c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.450486 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.466304 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.471899 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.471957 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.471974 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.471998 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.472017 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.481304 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc 
kubenswrapper[4802]: I1201 19:57:53.500513 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.520406 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.535740 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.550335 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.568450 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.573728 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.573766 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.573778 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.573795 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.573806 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.582407 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:53Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.676607 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc 
kubenswrapper[4802]: I1201 19:57:53.676688 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.676711 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.676743 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.676765 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.719185 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:53 crc kubenswrapper[4802]: E1201 19:57:53.719446 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.779402 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.779457 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.779468 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.779482 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.779493 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.882128 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.882228 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.882246 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.882269 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.882286 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.984797 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.984867 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.984885 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.984912 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:53 crc kubenswrapper[4802]: I1201 19:57:53.984928 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:53Z","lastTransitionTime":"2025-12-01T19:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.087598 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.087660 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.087677 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.087704 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.087760 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:54Z","lastTransitionTime":"2025-12-01T19:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.190486 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.190522 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.190530 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.190544 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.190553 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:54Z","lastTransitionTime":"2025-12-01T19:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.255962 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/3.log" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.262388 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 19:57:54 crc kubenswrapper[4802]: E1201 19:57:54.262602 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.274842 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.287413 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546fd59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.292600 4802 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.292643 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.292654 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.292669 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.292681 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:54Z","lastTransitionTime":"2025-12-01T19:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.302466 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f
79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.317719 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.346759 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:24Z\\\",\\\"message\\\":\\\"2025-12-01T19:56:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4\\\\n2025-12-01T19:56:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4 to /host/opt/cni/bin/\\\\n2025-12-01T19:56:39Z [verbose] multus-daemon started\\\\n2025-12-01T19:56:39Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T19:57:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.393072 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"de-identity-vrzqb openshift-network-diagnostics/network-check-target-xd92c 
openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-image-registry/node-ca-gp8pz openshift-machine-config-operator/machine-config-daemon-tw4xd openshift-machine-config-operator/kube-rbac-proxy-crio-crc]\\\\nI1201 19:57:52.620550 6973 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1201 19:57:52.620568 6973 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI1201 19:57:52.620577 6973 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI1201 19:57:52.620585 6973 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI1201 19:57:52.620590 6973 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI1201 19:57:52.620595 6973 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nF1201 19:57:52.620594 6973 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b
0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.394296 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.394329 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.394339 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.394356 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.394365 4802 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:54Z","lastTransitionTime":"2025-12-01T19:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.403496 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c230905-38b8-447b-96df-05a7c6ea653f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4442bba04936832a31754ab2a26103c31da700a120cd81b45dcf53c004e9b46a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.421400 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c281dc-3264-4ba0-bc35-66fa685b5f57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef2d82c0514ca36725246db0e4a9a5c5015f34a0c280689d0d3e0cbcde56b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bfbe6b9c9e36b730ca3b4a6a42c15028518d500a9e0743c0d2dd8626e06f4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c49e47677142d724eb56b940158bc7ff30886cb98911431ef3a3ecd63969ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc45f69bad64d259b95e47b43b39bbcd70edd29f4ef79a96f44b8c65df41c124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d445731c7cee7ec09d220df6418d3b6752b956d188c3d853ac470dfa11037747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4a806b9002a57bdaaea82b14656b90dc50b57f1e6246fc5d46d6977ca2a6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4a806b9002a57bdaaea82b14656b90dc50b57f1e6246fc5d46d6977ca2a6dc\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.430374 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.442601 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e
6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.454625 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\"
,\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.464307 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.474727 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.485947 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.494696 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.496100 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.496129 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.496142 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.496159 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.496170 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:54Z","lastTransitionTime":"2025-12-01T19:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.506246 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.516438 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.526544 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03
231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.537276 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:54Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.599092 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.599148 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.599159 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.599175 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.599184 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:54Z","lastTransitionTime":"2025-12-01T19:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.701566 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.701626 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.701643 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.701666 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.701685 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:54Z","lastTransitionTime":"2025-12-01T19:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.719646 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.719723 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.719648 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:54 crc kubenswrapper[4802]: E1201 19:57:54.719816 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:54 crc kubenswrapper[4802]: E1201 19:57:54.719909 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:54 crc kubenswrapper[4802]: E1201 19:57:54.719979 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.804402 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.804463 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.804474 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.804488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.804496 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:54Z","lastTransitionTime":"2025-12-01T19:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.907028 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.907473 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.907647 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.907796 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:54 crc kubenswrapper[4802]: I1201 19:57:54.907919 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:54Z","lastTransitionTime":"2025-12-01T19:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.011057 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.011350 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.011468 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.011600 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.011682 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.113777 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.113817 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.113829 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.113847 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.113860 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.216113 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.216186 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.216265 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.216296 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.216345 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.319183 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.319285 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.319303 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.319330 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.319348 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.421420 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.421519 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.421539 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.421565 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.421581 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.524018 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.524089 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.524111 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.524143 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.524169 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.626750 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.626820 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.626844 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.626879 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.626904 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.719468 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:55 crc kubenswrapper[4802]: E1201 19:57:55.719891 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.729056 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.729136 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.729157 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.729182 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.729236 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.832554 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.832612 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.832631 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.832696 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.832714 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.935231 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.935297 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.935314 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.935338 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:55 crc kubenswrapper[4802]: I1201 19:57:55.935355 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:55Z","lastTransitionTime":"2025-12-01T19:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.038635 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.038677 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.038685 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.038700 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.038709 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.141678 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.141713 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.141721 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.141733 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.141743 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.244134 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.244250 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.244277 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.244310 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.244338 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.346745 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.346790 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.346861 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.346892 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.346911 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.453837 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.453913 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.453932 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.454133 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.454168 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.559452 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.559550 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.559572 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.559603 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.559623 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.663456 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.663526 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.663546 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.663576 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.663593 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.719819 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.719910 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.719995 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:56 crc kubenswrapper[4802]: E1201 19:57:56.720102 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:56 crc kubenswrapper[4802]: E1201 19:57:56.720016 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:56 crc kubenswrapper[4802]: E1201 19:57:56.720358 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.766474 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.766531 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.766543 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.766563 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.766580 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.771167 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:56 crc kubenswrapper[4802]: E1201 19:57:56.771406 4802 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:57:56 crc kubenswrapper[4802]: E1201 19:57:56.771527 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs podName:008be62d-2cef-42a3-912f-2b2e58f8e30b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.771496413 +0000 UTC m=+162.334056094 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs") pod "network-metrics-daemon-p8cs7" (UID: "008be62d-2cef-42a3-912f-2b2e58f8e30b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.869301 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.869337 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.869346 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.869363 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.869374 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.972157 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.972211 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.972220 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.972234 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:56 crc kubenswrapper[4802]: I1201 19:57:56.972243 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:56Z","lastTransitionTime":"2025-12-01T19:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.075917 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.075987 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.076006 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.076033 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.076051 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:57Z","lastTransitionTime":"2025-12-01T19:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.178855 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.178918 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.178936 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.178965 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.178984 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:57Z","lastTransitionTime":"2025-12-01T19:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.281712 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.281766 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.281794 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.281820 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.281840 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:57Z","lastTransitionTime":"2025-12-01T19:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.384358 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.384441 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.384459 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.384484 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.384501 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:57Z","lastTransitionTime":"2025-12-01T19:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.487419 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.487487 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.487522 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.487549 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.487566 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:57Z","lastTransitionTime":"2025-12-01T19:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.591229 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.591303 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.591321 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.591346 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.591367 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:57Z","lastTransitionTime":"2025-12-01T19:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.694817 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.694879 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.694896 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.694919 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.694938 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:57Z","lastTransitionTime":"2025-12-01T19:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.720282 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:57 crc kubenswrapper[4802]: E1201 19:57:57.720581 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.798691 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.798823 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.798841 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.798866 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.798884 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:57Z","lastTransitionTime":"2025-12-01T19:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.901850 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.901910 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.901927 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.901953 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:57 crc kubenswrapper[4802]: I1201 19:57:57.901975 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:57Z","lastTransitionTime":"2025-12-01T19:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.004760 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.004986 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.005042 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.005077 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.005102 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.108354 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.108430 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.108454 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.108486 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.108511 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.211471 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.211519 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.211534 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.211551 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.211562 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.313584 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.313630 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.313641 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.313662 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.313673 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.416840 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.416902 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.416919 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.416943 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.416962 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.520409 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.520488 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.520512 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.520538 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.520557 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.623762 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.623826 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.623844 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.623868 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.623884 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.720493 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:57:58 crc kubenswrapper[4802]: E1201 19:57:58.720692 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.720744 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.720800 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:57:58 crc kubenswrapper[4802]: E1201 19:57:58.720908 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:57:58 crc kubenswrapper[4802]: E1201 19:57:58.721082 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.728372 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.728436 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.728459 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.728498 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.728522 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.739993 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gp8pz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"22a5a396-7caa-46ff-8456-3f6eb84db887\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8bdd4fd3b06f028adfbaa21dfb0e0d3270eecbe871243f2bbf6e63278b4f6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sklb8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:43Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gp8pz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.755545 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"008be62d-2cef-42a3-912f-2b2e58f8e30b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n95sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8cs7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.774190 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc9332ba-a2ce-493d-904b-c4f4d7499d43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72a6498825f7c8427eefd172eebe1f4b8dad960f1d474c78c06f901f637ecd3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68fa39b8f5c24877fa42088699094e8bdce24f0679d8c50294f569a771789ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaeea38dc683a9459f24ab9a0de9c3701ae217e27b425b7eed25530cb7523af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03
231e86d4c1ff74d1f637ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e9fd4def4f9f94c44bbbabe6fb4dbad01ed3e03231e86d4c1ff74d1f637ff9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.796367 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.809975 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.823087 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72ed0a1a33b0e870dd4f111ad9590b77d5f542e713f3abe66642d1d27dc53b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.830880 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.830951 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.830970 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.830996 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.831013 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.840037 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9wvdw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c338aa9-4647-4436-aaf2-d7b1d85b9219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccdeff7df79866286b969522a5570d60f159848d0d02f392258139649acd30e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7p9c\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9wvdw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.857561 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75cce5875fc1d0ec1cca8a9968a7f4a4060e2b740ebdd493d8f96394ba714385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.870534 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5dd3f54-4b2a-4ae6-9cce-d5ac0e044b0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8602c3d5a6475cdb06fb490a90a96ce1a6cf7bfbf1d3142c5b861de2838e0e8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaa0e17f47aef6ddde23ea47f070a3015546f
d59799ea48e6618eea9f9abce64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2d4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cknrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.901254 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"933fb25a-a01a-464e-838a-df1d07bca99e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:52Z\\\",\\\"message\\\":\\\"de-identity-vrzqb openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-image-registry/node-ca-gp8pz openshift-machine-config-operator/machine-config-daemon-tw4xd 
openshift-machine-config-operator/kube-rbac-proxy-crio-crc]\\\\nI1201 19:57:52.620550 6973 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1201 19:57:52.620568 6973 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI1201 19:57:52.620577 6973 obj_retry.go:365] Adding new object: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI1201 19:57:52.620585 6973 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc in node crc\\\\nI1201 19:57:52.620590 6973 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc after 0 failed attempt(s)\\\\nI1201 19:57:52.620595 6973 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nF1201 19:57:52.620594 6973 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:57:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c98845f2793b7d311b
0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d9xgd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-t7nr2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.915701 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c230905-38b8-447b-96df-05a7c6ea653f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4442bba04936832a31754ab2a26103c31da700a120cd81b45dcf53c004e9b46a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0247fe32e9c3d59c78a0e9abe0e4597eed112df1efdf3ada9cf424e426b912ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.934560 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.934633 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.934659 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.934684 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.934702 4802 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:58Z","lastTransitionTime":"2025-12-01T19:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.940861 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c281dc-3264-4ba0-bc35-66fa685b5f57\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cef2d82c0514ca36725246db0e4a9a5c5015f34a0c280689d0d3e0cbcde56b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bfbe6b9c9e36b730ca3b4a6a42c15028518d500a9e0743c0d2dd8626e06f4b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c49e47677142d724eb56b940158bc7ff30886cb98911431ef3a3ecd63969ea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc45f69bad64d259b95e47b43b39bbcd70edd29f4ef79a96f44b8c65df41c124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d445731c7cee7ec09d220df6418d3b6752b956d188c3d853ac470dfa11037747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab4a806b9002a57bdaaea82b14656b90dc50b5
7f1e6246fc5d46d6977ca2a6dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4a806b9002a57bdaaea82b14656b90dc50b57f1e6246fc5d46d6977ca2a6dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93e59e9bb0da3a1e48726a0c6f1c5f2dcba025d1c1d53a933e0dd7fd4db5581e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10c024cb304aa37cf5334bbf84b35495cedd0e751a199697d050f6804da2516e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.960673 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ecf62ad-505a-412f-9773-217b6f1a855c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9df831f2575bee4e19b084324d7c929195726e620a58bd95fff4188a34abb954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27a5c3dfd2eeaba2e336d870981ad4a42072d799c3ed0d8afd9c77f452da9dff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39bd491e3d23ffddd1e6fa66b799668ed6a21023377288b94971f482a7866798\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.980114 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5d8dab782ea2f0da941c180eef87f1c00e7f7098b6a06a483c1a0b639311214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84c2a579493816002f256a75dab5f8451e84601cded4b65ec1abf0bac716683a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:58 crc kubenswrapper[4802]: I1201 19:57:58.999004 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8zl28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd82ca15-4489-4c15-aaf0-afb6b6787dc6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:57:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T19:57:24Z\\\",\\\"message\\\":\\\"2025-12-01T19:56:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4\\\\n2025-12-01T19:56:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ef246b4d-8741-447d-95fc-3e667c36d7f4 to /host/opt/cni/bin/\\\\n2025-12-01T19:56:39Z [verbose] multus-daemon started\\\\n2025-12-01T19:56:39Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T19:57:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g58lj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8zl28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:58Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.018388 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764618980\\\\\\\\\\\\\\\" (2025-12-01 19:56:20 +0000 UTC to 2025-12-31 19:56:21 +0000 UTC (now=2025-12-01 19:56:36.419415154 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419610 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764618991\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764618991\\\\\\\\\\\\\\\" (2025-12-01 18:56:31 +0000 UTC to 2026-12-01 18:56:31 +0000 UTC (now=2025-12-01 19:56:36.419586749 +0000 UTC))\\\\\\\"\\\\nI1201 19:56:36.419635 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1201 19:56:36.419665 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1201 19:56:36.419788 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419806 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3464231638/tls.crt::/tmp/serving-cert-3464231638/tls.key\\\\\\\"\\\\nI1201 19:56:36.419816 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1201 19:56:36.419830 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1201 19:56:36.419931 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1201 19:56:36.419952 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1201 19:56:36.419972 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nF1201 19:56:36.419974 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-01T19:56:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:59Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.035596 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:59Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.037399 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.037445 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.037458 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc 
kubenswrapper[4802]: I1201 19:57:59.037476 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.037488 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.048777 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23e1ef99-f507-42ea-a076-4fc1681c7e8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5fbfc440cfdf89fa370c6b4fa31819ea722ced9e6fa76dd33662fdaaba7b6a3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hch6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tw4xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:59Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.068285 4802 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-htfwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fb49bb8-3d0a-4ff5-80bf-60c34f310345\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T19:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c255f325ac4e39325605a7f8f0f99237e8efa2630bee7931972504b424d7b87a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T19:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad61435616386c858bd2e641816ecf083e4103bc32a15cb4c4c5cdfea98c9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48fc1f9825a5ded522d70b9bdf06900382e4c1740467ea6854cda8cfdc2d9f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d81d56c071a46d5b42d40220ef16709c39f3dc5e81798ca0f40be3a925895720\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:41Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a32f27c8ef5c9af035938b055ba94a1a2b98472eabc34114c3d4f0d7eefea202\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4575cfa46ed39883a56d92fd98c7da5c44dc176e66657b1fe46f0302bdfd2cca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cbc7bd018f4e22c644facddf8dc6a560935b655a22fd91fb23b42fa2cce433c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T19:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T19:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8zcvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T19:56:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-htfwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T19:57:59Z is after 2025-08-24T17:21:41Z" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.140833 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.140877 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.140896 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.140923 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.140940 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.243950 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.244014 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.244039 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.244070 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.244094 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.346407 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.346468 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.346492 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.346553 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.346576 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.448803 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.448869 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.448893 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.448924 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.448947 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.551447 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.551482 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.551490 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.551520 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.551530 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.653740 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.653788 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.653806 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.653839 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.653861 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.719814 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:57:59 crc kubenswrapper[4802]: E1201 19:57:59.719944 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.757250 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.757365 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.757382 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.757404 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.757421 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.859600 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.859658 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.859675 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.859697 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.859714 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.962065 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.962124 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.962141 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.962166 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:57:59 crc kubenswrapper[4802]: I1201 19:57:59.962182 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:57:59Z","lastTransitionTime":"2025-12-01T19:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.065012 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.065091 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.065117 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.065149 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.065174 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.168036 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.168099 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.168113 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.168132 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.168142 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.270754 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.270813 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.270832 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.270856 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.270873 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.373668 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.373731 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.373749 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.373774 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.373792 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.477052 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.477125 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.477143 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.477172 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.477231 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.580042 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.580170 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.580192 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.580252 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.580270 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.683107 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.683140 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.683148 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.683161 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.683170 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.719460 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.719523 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.719461 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:00 crc kubenswrapper[4802]: E1201 19:58:00.719622 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:00 crc kubenswrapper[4802]: E1201 19:58:00.719769 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:00 crc kubenswrapper[4802]: E1201 19:58:00.719870 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.785586 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.785644 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.785655 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.785670 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.785679 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.887612 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.887687 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.887703 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.887730 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.887749 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.990537 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.990593 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.990605 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.990622 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:00 crc kubenswrapper[4802]: I1201 19:58:00.990634 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:00Z","lastTransitionTime":"2025-12-01T19:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.093288 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.093358 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.093377 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.093402 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.093422 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:01Z","lastTransitionTime":"2025-12-01T19:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.196421 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.196472 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.196490 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.196510 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.196525 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:01Z","lastTransitionTime":"2025-12-01T19:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.301640 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.301691 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.301709 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.301731 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.301748 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:01Z","lastTransitionTime":"2025-12-01T19:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.404083 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.404367 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.404489 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.404628 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.404729 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:01Z","lastTransitionTime":"2025-12-01T19:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.507278 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.507332 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.507348 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.507372 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.507391 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:01Z","lastTransitionTime":"2025-12-01T19:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.610120 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.610223 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.610248 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.610277 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.610298 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:01Z","lastTransitionTime":"2025-12-01T19:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.712808 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.712844 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.712853 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.712866 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.712874 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:01Z","lastTransitionTime":"2025-12-01T19:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.719639 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:01 crc kubenswrapper[4802]: E1201 19:58:01.720044 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.816547 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.816609 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.816626 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.816650 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.816667 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:01Z","lastTransitionTime":"2025-12-01T19:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.919148 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.919253 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.919277 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.919305 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:01 crc kubenswrapper[4802]: I1201 19:58:01.919326 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:01Z","lastTransitionTime":"2025-12-01T19:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.022877 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.022969 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.022995 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.023027 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.023048 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:02Z","lastTransitionTime":"2025-12-01T19:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.126419 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.126888 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.127142 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.127393 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.127579 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:02Z","lastTransitionTime":"2025-12-01T19:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.231387 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.231683 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.231870 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.232020 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.232246 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:02Z","lastTransitionTime":"2025-12-01T19:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.335568 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.335638 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.335652 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.335669 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.335741 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:02Z","lastTransitionTime":"2025-12-01T19:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.438563 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.438622 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.438640 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.438665 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.438685 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:02Z","lastTransitionTime":"2025-12-01T19:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.541468 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.541520 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.541541 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.541565 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.541583 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:02Z","lastTransitionTime":"2025-12-01T19:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.645036 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.645113 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.645133 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.645157 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.645174 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:02Z","lastTransitionTime":"2025-12-01T19:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.658188 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.658298 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.658316 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.658342 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.658362 4802 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T19:58:02Z","lastTransitionTime":"2025-12-01T19:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.719601 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.719667 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.719690 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:02 crc kubenswrapper[4802]: E1201 19:58:02.719825 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:02 crc kubenswrapper[4802]: E1201 19:58:02.719919 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:02 crc kubenswrapper[4802]: E1201 19:58:02.720127 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.730493 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5"] Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.730973 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.733823 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.734253 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.734315 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.735845 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.798664 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9wvdw" podStartSLOduration=85.798633748 podStartE2EDuration="1m25.798633748s" podCreationTimestamp="2025-12-01 19:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:02.798594247 +0000 UTC m=+104.361153948" watchObservedRunningTime="2025-12-01 19:58:02.798633748 +0000 UTC m=+104.361193429" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.829003 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gp8pz" podStartSLOduration=84.828974228 podStartE2EDuration="1m24.828974228s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:02.813301573 +0000 UTC m=+104.375861284" watchObservedRunningTime="2025-12-01 19:58:02.828974228 +0000 UTC m=+104.391533909" Dec 01 
19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.841602 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b786c4b3-9c96-4785-ab3a-2d69f989bd10-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.841729 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b786c4b3-9c96-4785-ab3a-2d69f989bd10-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.841795 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b786c4b3-9c96-4785-ab3a-2d69f989bd10-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.841884 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b786c4b3-9c96-4785-ab3a-2d69f989bd10-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.841937 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/b786c4b3-9c96-4785-ab3a-2d69f989bd10-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.848471 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=55.84844154 podStartE2EDuration="55.84844154s" podCreationTimestamp="2025-12-01 19:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:02.846635255 +0000 UTC m=+104.409194936" watchObservedRunningTime="2025-12-01 19:58:02.84844154 +0000 UTC m=+104.411001231" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.928105 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cknrk" podStartSLOduration=83.928083017 podStartE2EDuration="1m23.928083017s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:02.927998494 +0000 UTC m=+104.490558165" watchObservedRunningTime="2025-12-01 19:58:02.928083017 +0000 UTC m=+104.490642668" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.942640 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b786c4b3-9c96-4785-ab3a-2d69f989bd10-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.942721 4802 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b786c4b3-9c96-4785-ab3a-2d69f989bd10-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.942754 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b786c4b3-9c96-4785-ab3a-2d69f989bd10-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.942816 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b786c4b3-9c96-4785-ab3a-2d69f989bd10-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.942860 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b786c4b3-9c96-4785-ab3a-2d69f989bd10-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.942887 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b786c4b3-9c96-4785-ab3a-2d69f989bd10-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.942959 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b786c4b3-9c96-4785-ab3a-2d69f989bd10-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.943889 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b786c4b3-9c96-4785-ab3a-2d69f989bd10-service-ca\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.950225 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b786c4b3-9c96-4785-ab3a-2d69f989bd10-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.961956 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b786c4b3-9c96-4785-ab3a-2d69f989bd10-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-mlkd5\" (UID: \"b786c4b3-9c96-4785-ab3a-2d69f989bd10\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" Dec 01 19:58:02 crc kubenswrapper[4802]: I1201 19:58:02.970680 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podStartSLOduration=86.970652595 podStartE2EDuration="1m26.970652595s" podCreationTimestamp="2025-12-01 19:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:02.950693937 +0000 UTC m=+104.513253588" watchObservedRunningTime="2025-12-01 19:58:02.970652595 +0000 UTC m=+104.533212256" Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.022759 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8zl28" podStartSLOduration=85.022733787 podStartE2EDuration="1m25.022733787s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:02.992586844 +0000 UTC m=+104.555146525" watchObservedRunningTime="2025-12-01 19:58:03.022733787 +0000 UTC m=+104.585293458" Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.034711 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=14.034693448 podStartE2EDuration="14.034693448s" podCreationTimestamp="2025-12-01 19:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:03.034240234 +0000 UTC m=+104.596799885" watchObservedRunningTime="2025-12-01 19:58:03.034693448 +0000 UTC m=+104.597253099" Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.044876 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5"
Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.057753 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=13.057736281 podStartE2EDuration="13.057736281s" podCreationTimestamp="2025-12-01 19:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:03.056545294 +0000 UTC m=+104.619104965" watchObservedRunningTime="2025-12-01 19:58:03.057736281 +0000 UTC m=+104.620295922"
Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.069644 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podStartSLOduration=86.069621839 podStartE2EDuration="1m26.069621839s" podCreationTimestamp="2025-12-01 19:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:03.0689904 +0000 UTC m=+104.631550051" watchObservedRunningTime="2025-12-01 19:58:03.069621839 +0000 UTC m=+104.632181490"
Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.104301 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-htfwc" podStartSLOduration=85.104280982 podStartE2EDuration="1m25.104280982s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:03.087343948 +0000 UTC m=+104.649903629" watchObservedRunningTime="2025-12-01 19:58:03.104280982 +0000 UTC m=+104.666840633"
Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.104648 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.104641184 podStartE2EDuration="1m27.104641184s" podCreationTimestamp="2025-12-01 19:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:03.103143307 +0000 UTC m=+104.665702958" watchObservedRunningTime="2025-12-01 19:58:03.104641184 +0000 UTC m=+104.667200835"
Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.297515 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" event={"ID":"b786c4b3-9c96-4785-ab3a-2d69f989bd10","Type":"ContainerStarted","Data":"0bccd724242e499441c75e5f810a3fb3a12d7b6c9cf875121e641631f683fc7f"}
Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.297572 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" event={"ID":"b786c4b3-9c96-4785-ab3a-2d69f989bd10","Type":"ContainerStarted","Data":"c4fb9a8bd72645f4ee85a591739503294d843214e3ba1514e658a81d5b661b47"}
Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.311362 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-mlkd5" podStartSLOduration=85.311334633 podStartE2EDuration="1m25.311334633s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:03.310702703 +0000 UTC m=+104.873262344" watchObservedRunningTime="2025-12-01 19:58:03.311334633 +0000 UTC m=+104.873894314"
Dec 01 19:58:03 crc kubenswrapper[4802]: I1201 19:58:03.719077 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:03 crc kubenswrapper[4802]: E1201 19:58:03.719359 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:04 crc kubenswrapper[4802]: I1201 19:58:04.719512 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:04 crc kubenswrapper[4802]: I1201 19:58:04.719673 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:04 crc kubenswrapper[4802]: I1201 19:58:04.719788 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:04 crc kubenswrapper[4802]: E1201 19:58:04.719899 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:04 crc kubenswrapper[4802]: E1201 19:58:04.719973 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:04 crc kubenswrapper[4802]: E1201 19:58:04.720057 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:05 crc kubenswrapper[4802]: I1201 19:58:05.719364 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:05 crc kubenswrapper[4802]: E1201 19:58:05.719879 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:05 crc kubenswrapper[4802]: I1201 19:58:05.720116 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"
Dec 01 19:58:05 crc kubenswrapper[4802]: E1201 19:58:05.720313 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e"
Dec 01 19:58:06 crc kubenswrapper[4802]: I1201 19:58:06.719699 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:06 crc kubenswrapper[4802]: I1201 19:58:06.719805 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:06 crc kubenswrapper[4802]: I1201 19:58:06.719718 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:06 crc kubenswrapper[4802]: E1201 19:58:06.719936 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:06 crc kubenswrapper[4802]: E1201 19:58:06.720033 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:06 crc kubenswrapper[4802]: E1201 19:58:06.720475 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:07 crc kubenswrapper[4802]: I1201 19:58:07.719912 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:07 crc kubenswrapper[4802]: E1201 19:58:07.720075 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:08 crc kubenswrapper[4802]: I1201 19:58:08.719298 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:08 crc kubenswrapper[4802]: I1201 19:58:08.719422 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:08 crc kubenswrapper[4802]: E1201 19:58:08.721411 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:08 crc kubenswrapper[4802]: I1201 19:58:08.721455 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:08 crc kubenswrapper[4802]: E1201 19:58:08.721626 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:08 crc kubenswrapper[4802]: E1201 19:58:08.721865 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:09 crc kubenswrapper[4802]: I1201 19:58:09.719438 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:09 crc kubenswrapper[4802]: E1201 19:58:09.719860 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:10 crc kubenswrapper[4802]: I1201 19:58:10.719146 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:10 crc kubenswrapper[4802]: I1201 19:58:10.719256 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:10 crc kubenswrapper[4802]: I1201 19:58:10.719400 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:10 crc kubenswrapper[4802]: E1201 19:58:10.719403 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:10 crc kubenswrapper[4802]: E1201 19:58:10.719621 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:10 crc kubenswrapper[4802]: E1201 19:58:10.719785 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:11 crc kubenswrapper[4802]: I1201 19:58:11.327019 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/1.log"
Dec 01 19:58:11 crc kubenswrapper[4802]: I1201 19:58:11.327842 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/0.log"
Dec 01 19:58:11 crc kubenswrapper[4802]: I1201 19:58:11.327922 4802 generic.go:334] "Generic (PLEG): container finished" podID="bd82ca15-4489-4c15-aaf0-afb6b6787dc6" containerID="8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64" exitCode=1
Dec 01 19:58:11 crc kubenswrapper[4802]: I1201 19:58:11.327964 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8zl28" event={"ID":"bd82ca15-4489-4c15-aaf0-afb6b6787dc6","Type":"ContainerDied","Data":"8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64"}
Dec 01 19:58:11 crc kubenswrapper[4802]: I1201 19:58:11.328020 4802 scope.go:117] "RemoveContainer" containerID="f02e146ee1bd8c3eab64647139013e81f9bac99a984241d3b7016afb3e328ef3"
Dec 01 19:58:11 crc kubenswrapper[4802]: I1201 19:58:11.328604 4802 scope.go:117] "RemoveContainer" containerID="8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64"
Dec 01 19:58:11 crc kubenswrapper[4802]: E1201 19:58:11.328863 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8zl28_openshift-multus(bd82ca15-4489-4c15-aaf0-afb6b6787dc6)\"" pod="openshift-multus/multus-8zl28" podUID="bd82ca15-4489-4c15-aaf0-afb6b6787dc6"
Dec 01 19:58:11 crc kubenswrapper[4802]: I1201 19:58:11.719734 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:11 crc kubenswrapper[4802]: E1201 19:58:11.719976 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:12 crc kubenswrapper[4802]: I1201 19:58:12.334294 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/1.log"
Dec 01 19:58:12 crc kubenswrapper[4802]: I1201 19:58:12.720028 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:12 crc kubenswrapper[4802]: I1201 19:58:12.720167 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:12 crc kubenswrapper[4802]: I1201 19:58:12.720345 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:12 crc kubenswrapper[4802]: E1201 19:58:12.720226 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:12 crc kubenswrapper[4802]: E1201 19:58:12.720433 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:12 crc kubenswrapper[4802]: E1201 19:58:12.720569 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:13 crc kubenswrapper[4802]: I1201 19:58:13.719708 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:13 crc kubenswrapper[4802]: E1201 19:58:13.719910 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:14 crc kubenswrapper[4802]: I1201 19:58:14.719293 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:14 crc kubenswrapper[4802]: I1201 19:58:14.719367 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:14 crc kubenswrapper[4802]: E1201 19:58:14.719399 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:14 crc kubenswrapper[4802]: E1201 19:58:14.719560 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:14 crc kubenswrapper[4802]: I1201 19:58:14.719858 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:14 crc kubenswrapper[4802]: E1201 19:58:14.720027 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:15 crc kubenswrapper[4802]: I1201 19:58:15.719640 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:15 crc kubenswrapper[4802]: E1201 19:58:15.719924 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:16 crc kubenswrapper[4802]: I1201 19:58:16.720031 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:16 crc kubenswrapper[4802]: I1201 19:58:16.720100 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:16 crc kubenswrapper[4802]: I1201 19:58:16.720108 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:16 crc kubenswrapper[4802]: E1201 19:58:16.720251 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:16 crc kubenswrapper[4802]: E1201 19:58:16.720315 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:16 crc kubenswrapper[4802]: E1201 19:58:16.720407 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:16 crc kubenswrapper[4802]: I1201 19:58:16.721969 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"
Dec 01 19:58:16 crc kubenswrapper[4802]: E1201 19:58:16.722342 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e"
Dec 01 19:58:17 crc kubenswrapper[4802]: I1201 19:58:17.719674 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:17 crc kubenswrapper[4802]: E1201 19:58:17.719906 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:18 crc kubenswrapper[4802]: I1201 19:58:18.719019 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:18 crc kubenswrapper[4802]: E1201 19:58:18.720665 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:18 crc kubenswrapper[4802]: I1201 19:58:18.720699 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:18 crc kubenswrapper[4802]: E1201 19:58:18.720968 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:18 crc kubenswrapper[4802]: I1201 19:58:18.720825 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:18 crc kubenswrapper[4802]: E1201 19:58:18.721226 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:18 crc kubenswrapper[4802]: E1201 19:58:18.740290 4802 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Dec 01 19:58:18 crc kubenswrapper[4802]: E1201 19:58:18.819439 4802 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 01 19:58:19 crc kubenswrapper[4802]: I1201 19:58:19.719504 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:19 crc kubenswrapper[4802]: E1201 19:58:19.719663 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:20 crc kubenswrapper[4802]: I1201 19:58:20.719562 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:20 crc kubenswrapper[4802]: I1201 19:58:20.719722 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:20 crc kubenswrapper[4802]: I1201 19:58:20.719779 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:20 crc kubenswrapper[4802]: E1201 19:58:20.719989 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:20 crc kubenswrapper[4802]: E1201 19:58:20.720168 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:20 crc kubenswrapper[4802]: E1201 19:58:20.720334 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:21 crc kubenswrapper[4802]: I1201 19:58:21.719924 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:21 crc kubenswrapper[4802]: E1201 19:58:21.720103 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:21 crc kubenswrapper[4802]: I1201 19:58:21.720660 4802 scope.go:117] "RemoveContainer" containerID="8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64"
Dec 01 19:58:22 crc kubenswrapper[4802]: I1201 19:58:22.375611 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/1.log"
Dec 01 19:58:22 crc kubenswrapper[4802]: I1201 19:58:22.376064 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8zl28" event={"ID":"bd82ca15-4489-4c15-aaf0-afb6b6787dc6","Type":"ContainerStarted","Data":"ffc9ffc722d4b9500e320b758594376eca2d1523039741ca000571e4d2a4865b"}
Dec 01 19:58:22 crc kubenswrapper[4802]: I1201 19:58:22.718993 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:22 crc kubenswrapper[4802]: I1201 19:58:22.719126 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:22 crc kubenswrapper[4802]: E1201 19:58:22.719229 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:22 crc kubenswrapper[4802]: E1201 19:58:22.719361 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:22 crc kubenswrapper[4802]: I1201 19:58:22.720154 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:22 crc kubenswrapper[4802]: E1201 19:58:22.720359 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:23 crc kubenswrapper[4802]: I1201 19:58:23.719772 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:23 crc kubenswrapper[4802]: E1201 19:58:23.719972 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:23 crc kubenswrapper[4802]: E1201 19:58:23.820550 4802 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 01 19:58:24 crc kubenswrapper[4802]: I1201 19:58:24.719034 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:24 crc kubenswrapper[4802]: I1201 19:58:24.719107 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 19:58:24 crc kubenswrapper[4802]: E1201 19:58:24.719309 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b"
Dec 01 19:58:24 crc kubenswrapper[4802]: I1201 19:58:24.719369 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 19:58:24 crc kubenswrapper[4802]: E1201 19:58:24.719532 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 19:58:24 crc kubenswrapper[4802]: E1201 19:58:24.719629 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 19:58:25 crc kubenswrapper[4802]: I1201 19:58:25.719421 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 19:58:25 crc kubenswrapper[4802]: E1201 19:58:25.719672 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 19:58:26 crc kubenswrapper[4802]: I1201 19:58:26.718976 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7"
Dec 01 19:58:26 crc kubenswrapper[4802]: E1201 19:58:26.719252 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:26 crc kubenswrapper[4802]: I1201 19:58:26.719338 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:26 crc kubenswrapper[4802]: E1201 19:58:26.719483 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:26 crc kubenswrapper[4802]: I1201 19:58:26.719339 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:26 crc kubenswrapper[4802]: E1201 19:58:26.719609 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:27 crc kubenswrapper[4802]: I1201 19:58:27.719347 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:27 crc kubenswrapper[4802]: E1201 19:58:27.720040 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:28 crc kubenswrapper[4802]: I1201 19:58:28.719951 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:28 crc kubenswrapper[4802]: I1201 19:58:28.719999 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:28 crc kubenswrapper[4802]: I1201 19:58:28.719942 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:28 crc kubenswrapper[4802]: E1201 19:58:28.721553 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:28 crc kubenswrapper[4802]: E1201 19:58:28.721688 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:28 crc kubenswrapper[4802]: E1201 19:58:28.721797 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:28 crc kubenswrapper[4802]: E1201 19:58:28.824217 4802 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 19:58:29 crc kubenswrapper[4802]: I1201 19:58:29.719152 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:29 crc kubenswrapper[4802]: E1201 19:58:29.719338 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:30 crc kubenswrapper[4802]: I1201 19:58:30.720159 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:30 crc kubenswrapper[4802]: I1201 19:58:30.720281 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:30 crc kubenswrapper[4802]: I1201 19:58:30.720323 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:30 crc kubenswrapper[4802]: E1201 19:58:30.720498 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:30 crc kubenswrapper[4802]: E1201 19:58:30.721155 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:30 crc kubenswrapper[4802]: E1201 19:58:30.721383 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:30 crc kubenswrapper[4802]: I1201 19:58:30.721697 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 19:58:30 crc kubenswrapper[4802]: E1201 19:58:30.722230 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-t7nr2_openshift-ovn-kubernetes(933fb25a-a01a-464e-838a-df1d07bca99e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" Dec 01 19:58:31 crc kubenswrapper[4802]: I1201 19:58:31.719569 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:31 crc kubenswrapper[4802]: E1201 19:58:31.720471 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:32 crc kubenswrapper[4802]: I1201 19:58:32.720101 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:32 crc kubenswrapper[4802]: E1201 19:58:32.720352 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:32 crc kubenswrapper[4802]: I1201 19:58:32.720416 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:32 crc kubenswrapper[4802]: I1201 19:58:32.720550 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:32 crc kubenswrapper[4802]: E1201 19:58:32.720603 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:32 crc kubenswrapper[4802]: E1201 19:58:32.720855 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:33 crc kubenswrapper[4802]: I1201 19:58:33.719409 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:33 crc kubenswrapper[4802]: E1201 19:58:33.719649 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:33 crc kubenswrapper[4802]: E1201 19:58:33.825912 4802 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 19:58:34 crc kubenswrapper[4802]: I1201 19:58:34.719967 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:34 crc kubenswrapper[4802]: I1201 19:58:34.720009 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:34 crc kubenswrapper[4802]: I1201 19:58:34.719976 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:34 crc kubenswrapper[4802]: E1201 19:58:34.720270 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:34 crc kubenswrapper[4802]: E1201 19:58:34.720761 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:34 crc kubenswrapper[4802]: E1201 19:58:34.720856 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:35 crc kubenswrapper[4802]: I1201 19:58:35.719949 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:35 crc kubenswrapper[4802]: E1201 19:58:35.720136 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:36 crc kubenswrapper[4802]: I1201 19:58:36.719785 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:36 crc kubenswrapper[4802]: E1201 19:58:36.720014 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:36 crc kubenswrapper[4802]: I1201 19:58:36.720073 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:36 crc kubenswrapper[4802]: I1201 19:58:36.720130 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:36 crc kubenswrapper[4802]: E1201 19:58:36.720311 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:36 crc kubenswrapper[4802]: E1201 19:58:36.720483 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:37 crc kubenswrapper[4802]: I1201 19:58:37.719988 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:37 crc kubenswrapper[4802]: E1201 19:58:37.720177 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:38 crc kubenswrapper[4802]: I1201 19:58:38.719929 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:38 crc kubenswrapper[4802]: I1201 19:58:38.719978 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:38 crc kubenswrapper[4802]: E1201 19:58:38.722268 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:38 crc kubenswrapper[4802]: I1201 19:58:38.722283 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:38 crc kubenswrapper[4802]: E1201 19:58:38.722377 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:38 crc kubenswrapper[4802]: E1201 19:58:38.722468 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:38 crc kubenswrapper[4802]: E1201 19:58:38.826868 4802 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 19:58:39 crc kubenswrapper[4802]: I1201 19:58:39.720015 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:39 crc kubenswrapper[4802]: E1201 19:58:39.720242 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:40 crc kubenswrapper[4802]: I1201 19:58:40.719709 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:40 crc kubenswrapper[4802]: I1201 19:58:40.719855 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:40 crc kubenswrapper[4802]: E1201 19:58:40.719942 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:40 crc kubenswrapper[4802]: I1201 19:58:40.719717 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:40 crc kubenswrapper[4802]: E1201 19:58:40.720083 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:40 crc kubenswrapper[4802]: E1201 19:58:40.720163 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:41 crc kubenswrapper[4802]: I1201 19:58:41.719533 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:41 crc kubenswrapper[4802]: E1201 19:58:41.719778 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:42 crc kubenswrapper[4802]: I1201 19:58:42.719452 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:42 crc kubenswrapper[4802]: I1201 19:58:42.719542 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:42 crc kubenswrapper[4802]: E1201 19:58:42.720047 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:42 crc kubenswrapper[4802]: I1201 19:58:42.719609 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:42 crc kubenswrapper[4802]: E1201 19:58:42.720181 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:42 crc kubenswrapper[4802]: E1201 19:58:42.720280 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:42 crc kubenswrapper[4802]: I1201 19:58:42.720924 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 19:58:43 crc kubenswrapper[4802]: I1201 19:58:43.456140 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/3.log" Dec 01 19:58:43 crc kubenswrapper[4802]: I1201 19:58:43.460236 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerStarted","Data":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} Dec 01 19:58:43 crc kubenswrapper[4802]: I1201 19:58:43.460799 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:58:43 crc 
kubenswrapper[4802]: I1201 19:58:43.508802 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podStartSLOduration=125.508774256 podStartE2EDuration="2m5.508774256s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:43.506973429 +0000 UTC m=+145.069533070" watchObservedRunningTime="2025-12-01 19:58:43.508774256 +0000 UTC m=+145.071333917" Dec 01 19:58:43 crc kubenswrapper[4802]: I1201 19:58:43.667930 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p8cs7"] Dec 01 19:58:43 crc kubenswrapper[4802]: I1201 19:58:43.668154 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:43 crc kubenswrapper[4802]: E1201 19:58:43.668362 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:43 crc kubenswrapper[4802]: I1201 19:58:43.719822 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:43 crc kubenswrapper[4802]: E1201 19:58:43.720033 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:43 crc kubenswrapper[4802]: E1201 19:58:43.829043 4802 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 19:58:44 crc kubenswrapper[4802]: I1201 19:58:44.614093 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.614486 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 20:00:46.614428321 +0000 UTC m=+268.176987992 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:44 crc kubenswrapper[4802]: I1201 19:58:44.614580 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:44 crc kubenswrapper[4802]: I1201 19:58:44.614752 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.614919 4802 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.615059 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 20:00:46.615025379 +0000 UTC m=+268.177585050 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.614933 4802 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.615241 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 20:00:46.615167673 +0000 UTC m=+268.177727354 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 19:58:44 crc kubenswrapper[4802]: I1201 19:58:44.715887 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:44 crc kubenswrapper[4802]: I1201 19:58:44.715967 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.716169 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.716208 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.716226 4802 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.716246 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.716348 4802 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.716374 4802 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.716317 4802 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 20:00:46.716295644 +0000 UTC m=+268.278855295 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.716523 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 20:00:46.71648157 +0000 UTC m=+268.279041251 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 19:58:44 crc kubenswrapper[4802]: I1201 19:58:44.719734 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:44 crc kubenswrapper[4802]: I1201 19:58:44.719761 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.720031 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:44 crc kubenswrapper[4802]: E1201 19:58:44.720092 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:45 crc kubenswrapper[4802]: I1201 19:58:45.719618 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:45 crc kubenswrapper[4802]: I1201 19:58:45.719704 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:45 crc kubenswrapper[4802]: E1201 19:58:45.719798 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:45 crc kubenswrapper[4802]: E1201 19:58:45.719928 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:46 crc kubenswrapper[4802]: I1201 19:58:46.719510 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:46 crc kubenswrapper[4802]: I1201 19:58:46.719554 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:46 crc kubenswrapper[4802]: E1201 19:58:46.719768 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:46 crc kubenswrapper[4802]: E1201 19:58:46.719899 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:47 crc kubenswrapper[4802]: I1201 19:58:47.719681 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:47 crc kubenswrapper[4802]: I1201 19:58:47.719774 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:47 crc kubenswrapper[4802]: E1201 19:58:47.720234 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 19:58:47 crc kubenswrapper[4802]: E1201 19:58:47.720368 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8cs7" podUID="008be62d-2cef-42a3-912f-2b2e58f8e30b" Dec 01 19:58:48 crc kubenswrapper[4802]: I1201 19:58:48.719583 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:48 crc kubenswrapper[4802]: I1201 19:58:48.719612 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:48 crc kubenswrapper[4802]: E1201 19:58:48.720435 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 19:58:48 crc kubenswrapper[4802]: E1201 19:58:48.720533 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 19:58:49 crc kubenswrapper[4802]: I1201 19:58:49.719289 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:58:49 crc kubenswrapper[4802]: I1201 19:58:49.719424 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 19:58:49 crc kubenswrapper[4802]: I1201 19:58:49.723352 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 19:58:49 crc kubenswrapper[4802]: I1201 19:58:49.723370 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 19:58:49 crc kubenswrapper[4802]: I1201 19:58:49.723631 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 19:58:49 crc kubenswrapper[4802]: I1201 19:58:49.723738 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 19:58:50 crc kubenswrapper[4802]: I1201 19:58:50.719803 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 19:58:50 crc kubenswrapper[4802]: I1201 19:58:50.720187 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 19:58:50 crc kubenswrapper[4802]: I1201 19:58:50.723143 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 19:58:50 crc kubenswrapper[4802]: I1201 19:58:50.723704 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.713984 4802 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.755224 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.756187 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hv2h9"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.756360 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.756749 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.757267 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.758057 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.758111 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-62kxj"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.758689 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.758835 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.764363 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.764373 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.764779 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.764419 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.764486 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.765169 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qfg52"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.765714 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.766410 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.766532 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.767354 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.767437 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.767547 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.767758 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.767988 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.768247 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.768431 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.769634 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6"] Dec 01 19:58:53 crc 
kubenswrapper[4802]: I1201 19:58:53.770244 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.770344 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.770441 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.770600 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.775424 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g42nq"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.775742 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.776095 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g42nq" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.778173 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.778236 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.781970 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.782246 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.784099 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.784439 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.784601 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.784819 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.784923 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.784985 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.786441 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-7cxs2"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.787101 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.800242 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.800696 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mx65z"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.802352 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.802894 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.802986 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.803977 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.804600 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.805345 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.805624 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.807391 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.824463 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.824791 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.825764 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.825902 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.826083 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 19:58:53 
crc kubenswrapper[4802]: I1201 19:58:53.826121 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.826245 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.826313 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.827441 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.827630 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.827818 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.828074 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.829942 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gjlgw"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.830636 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-gjlgw" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.831093 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.833258 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.833745 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.833835 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.833898 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.833750 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.834938 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.836015 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.836720 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.836822 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nsqcw"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.837523 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.838158 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.838972 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.846795 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.847588 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.847920 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pb67"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.848403 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.848535 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.849074 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.849735 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.853482 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.853682 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.853930 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.857614 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.858113 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.858230 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.859425 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4czjq"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.859750 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.859982 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.860528 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.861144 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.861368 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s8fb5"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.861708 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.868740 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.869737 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.870344 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.870766 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.873516 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.873549 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.874028 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.875490 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lvx7b"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.886784 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.887738 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.888319 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.891684 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.892465 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.892748 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.893092 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.893306 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.893429 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.893732 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.893984 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.895792 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.897065 4802 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.912690 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.913164 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.913289 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.913405 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.913659 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.913935 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.914087 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.914220 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.913944 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.914380 4802 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.914624 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.914723 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.914637 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.914687 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.915168 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.915176 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.915174 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.915271 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.915779 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.915810 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 19:58:53 crc kubenswrapper[4802]: 
I1201 19:58:53.915985 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.915792 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.916323 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.916751 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.916965 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.917156 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.917253 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.917280 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.917428 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.917591 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.918500 4802 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.919127 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.919159 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.921139 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.921883 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.922191 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.922712 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.923462 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zqg5r"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.924357 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.927251 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.930534 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mn4k7"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.930627 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.932514 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.933272 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.933991 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv"] Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.934942 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937532 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-audit-policies\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937570 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a4d8cc9-1731-4087-8e6f-ac0e184616fe-serving-cert\") pod \"openshift-config-operator-7777fb866f-r9w8p\" (UID: \"5a4d8cc9-1731-4087-8e6f-ac0e184616fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937601 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-encryption-config\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937623 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh722\" (UniqueName: \"kubernetes.io/projected/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-kube-api-access-nh722\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937649 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2c2d1ec-c588-4247-aae2-c228404a38e0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937681 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllm9\" (UniqueName: \"kubernetes.io/projected/fcf38656-7a15-4cd1-9038-83272327ce3c-kube-api-access-zllm9\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937704 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv86v\" (UniqueName: \"kubernetes.io/projected/417dfff6-8c73-474a-8ec2-aadab4e32131-kube-api-access-wv86v\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937727 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5217fe99-8018-4db8-8f1d-2e0b33056638-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x847\" (UID: \"5217fe99-8018-4db8-8f1d-2e0b33056638\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937752 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937777 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417dfff6-8c73-474a-8ec2-aadab4e32131-config\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937797 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-audit-dir\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937814 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937831 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937849 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-service-ca\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937903 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6btwh\" (UniqueName: \"kubernetes.io/projected/1cb3789f-6490-4c1c-a5a8-54f3d9ea3bb0-kube-api-access-6btwh\") pod \"migrator-59844c95c7-nqwc4\" (UID: \"1cb3789f-6490-4c1c-a5a8-54f3d9ea3bb0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.937982 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-serving-cert\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938035 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5217fe99-8018-4db8-8f1d-2e0b33056638-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x847\" (UID: \"5217fe99-8018-4db8-8f1d-2e0b33056638\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938078 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/417dfff6-8c73-474a-8ec2-aadab4e32131-serving-cert\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") 
" pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938097 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkmh7\" (UniqueName: \"kubernetes.io/projected/d2c2d1ec-c588-4247-aae2-c228404a38e0-kube-api-access-nkmh7\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938125 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46b256be-246d-43ad-99b9-7b67eca6762a-metrics-tls\") pod \"dns-operator-744455d44c-g42nq\" (UID: \"46b256be-246d-43ad-99b9-7b67eca6762a\") " pod="openshift-dns-operator/dns-operator-744455d44c-g42nq" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938186 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-config\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938246 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-policies\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938275 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938325 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938359 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b139dad-bbb0-4d0f-bd11-14f142ef1767-serving-cert\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938380 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/417dfff6-8c73-474a-8ec2-aadab4e32131-trusted-ca\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938549 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-etcd-ca\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 
19:58:53.938575 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938605 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec8148e4-5519-4b98-a17f-b80f1a44a4da-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c6jk6\" (UID: \"ec8148e4-5519-4b98-a17f-b80f1a44a4da\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938631 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-etcd-client\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938651 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938676 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938735 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkqq9\" (UniqueName: \"kubernetes.io/projected/6af22b89-378d-4aab-a028-2e19ec6e8d1c-kube-api-access-pkqq9\") pod \"downloads-7954f5f757-gjlgw\" (UID: \"6af22b89-378d-4aab-a028-2e19ec6e8d1c\") " pod="openshift-console/downloads-7954f5f757-gjlgw"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938762 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-dir\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938799 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938818 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-trusted-ca-bundle\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938854 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938878 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.938950 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939009 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg7t5\" (UniqueName: \"kubernetes.io/projected/5a4d8cc9-1731-4087-8e6f-ac0e184616fe-kube-api-access-cg7t5\") pod \"openshift-config-operator-7777fb866f-r9w8p\" (UID: \"5a4d8cc9-1731-4087-8e6f-ac0e184616fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939046 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-742m8\" (UniqueName: \"kubernetes.io/projected/ec8148e4-5519-4b98-a17f-b80f1a44a4da-kube-api-access-742m8\") pod \"cluster-samples-operator-665b6dd947-c6jk6\" (UID: \"ec8148e4-5519-4b98-a17f-b80f1a44a4da\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939069 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-etcd-client\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939132 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-config\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939152 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-serving-cert\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939179 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939245 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5217fe99-8018-4db8-8f1d-2e0b33056638-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x847\" (UID: \"5217fe99-8018-4db8-8f1d-2e0b33056638\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939274 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-config\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939297 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm622\" (UniqueName: \"kubernetes.io/projected/6b139dad-bbb0-4d0f-bd11-14f142ef1767-kube-api-access-gm622\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939319 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mtnt\" (UniqueName: \"kubernetes.io/projected/46b256be-246d-43ad-99b9-7b67eca6762a-kube-api-access-4mtnt\") pod \"dns-operator-744455d44c-g42nq\" (UID: \"46b256be-246d-43ad-99b9-7b67eca6762a\") " pod="openshift-dns-operator/dns-operator-744455d44c-g42nq"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939343 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-etcd-service-ca\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939359 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-oauth-config\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939379 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c852k\" (UniqueName: \"kubernetes.io/projected/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-kube-api-access-c852k\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939506 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.939654 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d2c2d1ec-c588-4247-aae2-c228404a38e0-images\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940392 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940436 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940463 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2c2d1ec-c588-4247-aae2-c228404a38e0-config\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940271 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940507 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvtw\" (UniqueName: \"kubernetes.io/projected/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-kube-api-access-tsvtw\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940544 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a4d8cc9-1731-4087-8e6f-ac0e184616fe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r9w8p\" (UID: \"5a4d8cc9-1731-4087-8e6f-ac0e184616fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940566 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-oauth-serving-cert\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940599 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-serving-cert\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940643 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-client-ca\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.940948 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6rtx7"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.941388 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.942300 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.942394 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.943290 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.944622 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.947493 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.947658 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.949872 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.950355 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.951979 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.952578 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hv2h9"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.954785 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.957687 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.958806 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.960876 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-62kxj"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.962971 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.965230 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qfg52"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.974126 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g7fvq"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.975847 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7cxs2"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.975954 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g7fvq"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.976297 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.978411 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.979983 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.982052 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.984242 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.984585 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pb67"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.985790 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g42nq"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.986991 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.988136 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.989093 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.990290 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.991494 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gjlgw"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.992986 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.994353 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zqg5r"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.995266 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.997526 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mx65z"]
Dec 01 19:58:53 crc kubenswrapper[4802]: I1201 19:58:53.998620 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s8fb5"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:53.999981 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.000173 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.001303 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.002432 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nsqcw"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.003448 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mn4k7"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.004516 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.005826 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.007474 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4czjq"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.008791 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dtz7g"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.009862 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dtz7g"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.009944 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.011001 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6rtx7"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.012139 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.013284 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.014787 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.015989 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dtz7g"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.017343 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.018847 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.019170 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g7fvq"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.019953 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-t4kb6"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.020986 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-t4kb6"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.021378 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hqg8s"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.022496 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hqg8s"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.022983 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hqg8s"]
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.044149 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-client-ca\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.044230 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-audit-policies\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.044848 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a4d8cc9-1731-4087-8e6f-ac0e184616fe-serving-cert\") pod \"openshift-config-operator-7777fb866f-r9w8p\" (UID: \"5a4d8cc9-1731-4087-8e6f-ac0e184616fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045048 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-encryption-config\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045098 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh722\" (UniqueName: \"kubernetes.io/projected/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-kube-api-access-nh722\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045137 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllm9\" (UniqueName: \"kubernetes.io/projected/fcf38656-7a15-4cd1-9038-83272327ce3c-kube-api-access-zllm9\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045177 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv86v\" (UniqueName: \"kubernetes.io/projected/417dfff6-8c73-474a-8ec2-aadab4e32131-kube-api-access-wv86v\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045231 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2c2d1ec-c588-4247-aae2-c228404a38e0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045268 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045298 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417dfff6-8c73-474a-8ec2-aadab4e32131-config\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045335 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5217fe99-8018-4db8-8f1d-2e0b33056638-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x847\" (UID: \"5217fe99-8018-4db8-8f1d-2e0b33056638\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045374 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-audit-dir\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045415 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045451 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045480 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-service-ca\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045527 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6btwh\" (UniqueName: \"kubernetes.io/projected/1cb3789f-6490-4c1c-a5a8-54f3d9ea3bb0-kube-api-access-6btwh\") pod \"migrator-59844c95c7-nqwc4\" (UID: \"1cb3789f-6490-4c1c-a5a8-54f3d9ea3bb0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045574 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-serving-cert\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045641 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5217fe99-8018-4db8-8f1d-2e0b33056638-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x847\" (UID: \"5217fe99-8018-4db8-8f1d-2e0b33056638\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045675 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/417dfff6-8c73-474a-8ec2-aadab4e32131-serving-cert\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045718 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-config\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045765 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkmh7\" (UniqueName: \"kubernetes.io/projected/d2c2d1ec-c588-4247-aae2-c228404a38e0-kube-api-access-nkmh7\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045800 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46b256be-246d-43ad-99b9-7b67eca6762a-metrics-tls\") pod \"dns-operator-744455d44c-g42nq\" (UID: \"46b256be-246d-43ad-99b9-7b67eca6762a\") " pod="openshift-dns-operator/dns-operator-744455d44c-g42nq"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045830 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-policies\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045859 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045892 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045918 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b139dad-bbb0-4d0f-bd11-14f142ef1767-serving-cert\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045946 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-etcd-ca\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045980 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046063 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec8148e4-5519-4b98-a17f-b80f1a44a4da-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c6jk6\" (UID: \"ec8148e4-5519-4b98-a17f-b80f1a44a4da\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046091 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/417dfff6-8c73-474a-8ec2-aadab4e32131-trusted-ca\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046117 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-etcd-client\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046148 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046188 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkqq9\" (UniqueName: \"kubernetes.io/projected/6af22b89-378d-4aab-a028-2e19ec6e8d1c-kube-api-access-pkqq9\") pod \"downloads-7954f5f757-gjlgw\" (UID: \"6af22b89-378d-4aab-a028-2e19ec6e8d1c\") " pod="openshift-console/downloads-7954f5f757-gjlgw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046236 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-dir\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046267 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046288 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-trusted-ca-bundle\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:54 crc kubenswrapper[4802]: 
I1201 19:58:54.046317 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046355 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvnbs\" (UniqueName: \"kubernetes.io/projected/d8d9c52e-a041-4e4c-a364-ef09f105a206-kube-api-access-nvnbs\") pod \"control-plane-machine-set-operator-78cbb6b69f-287bb\" (UID: \"d8d9c52e-a041-4e4c-a364-ef09f105a206\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046390 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046412 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046481 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-audit-policies\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046604 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg7t5\" (UniqueName: \"kubernetes.io/projected/5a4d8cc9-1731-4087-8e6f-ac0e184616fe-kube-api-access-cg7t5\") pod \"openshift-config-operator-7777fb866f-r9w8p\" (UID: \"5a4d8cc9-1731-4087-8e6f-ac0e184616fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046639 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-742m8\" (UniqueName: \"kubernetes.io/projected/ec8148e4-5519-4b98-a17f-b80f1a44a4da-kube-api-access-742m8\") pod \"cluster-samples-operator-665b6dd947-c6jk6\" (UID: \"ec8148e4-5519-4b98-a17f-b80f1a44a4da\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046698 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-etcd-client\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046732 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046754 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5217fe99-8018-4db8-8f1d-2e0b33056638-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x847\" (UID: \"5217fe99-8018-4db8-8f1d-2e0b33056638\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046778 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-config\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046802 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-serving-cert\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046828 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-config\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046856 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm622\" (UniqueName: \"kubernetes.io/projected/6b139dad-bbb0-4d0f-bd11-14f142ef1767-kube-api-access-gm622\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:54 crc 
kubenswrapper[4802]: I1201 19:58:54.046878 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mtnt\" (UniqueName: \"kubernetes.io/projected/46b256be-246d-43ad-99b9-7b67eca6762a-kube-api-access-4mtnt\") pod \"dns-operator-744455d44c-g42nq\" (UID: \"46b256be-246d-43ad-99b9-7b67eca6762a\") " pod="openshift-dns-operator/dns-operator-744455d44c-g42nq" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046905 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-etcd-service-ca\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046929 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-oauth-config\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046955 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c852k\" (UniqueName: \"kubernetes.io/projected/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-kube-api-access-c852k\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.046976 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d2c2d1ec-c588-4247-aae2-c228404a38e0-images\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.047005 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8d9c52e-a041-4e4c-a364-ef09f105a206-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-287bb\" (UID: \"d8d9c52e-a041-4e4c-a364-ef09f105a206\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.047041 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.047069 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2c2d1ec-c588-4247-aae2-c228404a38e0-config\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.047097 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.047120 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tsvtw\" (UniqueName: \"kubernetes.io/projected/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-kube-api-access-tsvtw\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.047218 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a4d8cc9-1731-4087-8e6f-ac0e184616fe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r9w8p\" (UID: \"5a4d8cc9-1731-4087-8e6f-ac0e184616fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.047244 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-oauth-serving-cert\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.047303 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-serving-cert\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.047858 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-service-ca\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.049581 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.049840 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.051392 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-serving-cert\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.051552 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.051621 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.052917 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/417dfff6-8c73-474a-8ec2-aadab4e32131-config\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.053255 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.055121 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5217fe99-8018-4db8-8f1d-2e0b33056638-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x847\" (UID: \"5217fe99-8018-4db8-8f1d-2e0b33056638\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.055246 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-etcd-client\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.055328 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-audit-dir\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.055794 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-encryption-config\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.056908 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.057051 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5217fe99-8018-4db8-8f1d-2e0b33056638-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x847\" (UID: \"5217fe99-8018-4db8-8f1d-2e0b33056638\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.057518 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/417dfff6-8c73-474a-8ec2-aadab4e32131-trusted-ca\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.057588 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.057681 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-policies\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.057704 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-config\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.045663 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-client-ca\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.057606 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a4d8cc9-1731-4087-8e6f-ac0e184616fe-serving-cert\") pod \"openshift-config-operator-7777fb866f-r9w8p\" (UID: \"5a4d8cc9-1731-4087-8e6f-ac0e184616fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.057798 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-config\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.058319 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-dir\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.058349 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-etcd-ca\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.059635 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d2c2d1ec-c588-4247-aae2-c228404a38e0-images\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.059567 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.060336 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-serving-cert\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.060388 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46b256be-246d-43ad-99b9-7b67eca6762a-metrics-tls\") pod \"dns-operator-744455d44c-g42nq\" (UID: \"46b256be-246d-43ad-99b9-7b67eca6762a\") " pod="openshift-dns-operator/dns-operator-744455d44c-g42nq" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.060455 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-trusted-ca-bundle\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.060688 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.060716 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.060988 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a4d8cc9-1731-4087-8e6f-ac0e184616fe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r9w8p\" (UID: \"5a4d8cc9-1731-4087-8e6f-ac0e184616fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" Dec 01 19:58:54 crc 
kubenswrapper[4802]: I1201 19:58:54.061026 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.061327 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.061349 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2c2d1ec-c588-4247-aae2-c228404a38e0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.061377 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/417dfff6-8c73-474a-8ec2-aadab4e32131-serving-cert\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.061745 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-etcd-client\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.062181 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-etcd-service-ca\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.062548 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2c2d1ec-c588-4247-aae2-c228404a38e0-config\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.063963 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-serving-cert\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.064102 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-config\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.064525 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.064525 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-oauth-serving-cert\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.064612 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-oauth-config\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.064714 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b139dad-bbb0-4d0f-bd11-14f142ef1767-serving-cert\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.064836 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.069311 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.074742 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.075506 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec8148e4-5519-4b98-a17f-b80f1a44a4da-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-c6jk6\" (UID: \"ec8148e4-5519-4b98-a17f-b80f1a44a4da\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.079767 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.098828 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.119233 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.139725 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.148343 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvnbs\" (UniqueName: \"kubernetes.io/projected/d8d9c52e-a041-4e4c-a364-ef09f105a206-kube-api-access-nvnbs\") pod \"control-plane-machine-set-operator-78cbb6b69f-287bb\" (UID: \"d8d9c52e-a041-4e4c-a364-ef09f105a206\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.148500 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8d9c52e-a041-4e4c-a364-ef09f105a206-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-287bb\" (UID: \"d8d9c52e-a041-4e4c-a364-ef09f105a206\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.158979 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.178831 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.199300 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.219340 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.239435 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.258745 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.281567 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.310960 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.318917 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.340135 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.359561 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.378817 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.405914 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.419414 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.439492 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.459004 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.479005 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.500036 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.526947 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.539237 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.558744 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.580296 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.598889 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.619333 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.639391 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.658779 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.679425 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.699999 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.719421 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.739888 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.759892 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.780079 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.799434 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.819382 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.859863 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.880183 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.899027 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.918867 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.937496 4802 request.go:700] Waited for 1.014560235s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.940574 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.954678 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8d9c52e-a041-4e4c-a364-ef09f105a206-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-287bb\" (UID: \"d8d9c52e-a041-4e4c-a364-ef09f105a206\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.960297 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 01 19:58:54 crc kubenswrapper[4802]: I1201 19:58:54.980904 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.000525 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.020144 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.039552 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.059521 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.078736 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.100378 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.119328 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.139157 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.158803 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.180125 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.200010 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.241379 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.259104 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.278390 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.299342 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.318729 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.339366 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.359666 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.379248 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.399258 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.418844 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.438906 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.458795 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.479086 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.499067 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.519713 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.539038 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.559483 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.578589 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.598802 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.618505 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.639260 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.658877 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.679642 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.698312 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.718780 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.739970 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.760091 4802 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.778657 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.799043 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.837898 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllm9\" (UniqueName: \"kubernetes.io/projected/fcf38656-7a15-4cd1-9038-83272327ce3c-kube-api-access-zllm9\") pod \"oauth-openshift-558db77b4-nsqcw\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.855688 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh722\" (UniqueName: \"kubernetes.io/projected/099f3158-3fcc-4d67-9ee2-1a9229e1ad23-kube-api-access-nh722\") pod \"apiserver-7bbb656c7d-dgmgt\" (UID: \"099f3158-3fcc-4d67-9ee2-1a9229e1ad23\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.876644 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5217fe99-8018-4db8-8f1d-2e0b33056638-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x847\" (UID: \"5217fe99-8018-4db8-8f1d-2e0b33056638\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.885816 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.893104 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6btwh\" (UniqueName: \"kubernetes.io/projected/1cb3789f-6490-4c1c-a5a8-54f3d9ea3bb0-kube-api-access-6btwh\") pod \"migrator-59844c95c7-nqwc4\" (UID: \"1cb3789f-6490-4c1c-a5a8-54f3d9ea3bb0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.914313 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg7t5\" (UniqueName: \"kubernetes.io/projected/5a4d8cc9-1731-4087-8e6f-ac0e184616fe-kube-api-access-cg7t5\") pod \"openshift-config-operator-7777fb866f-r9w8p\" (UID: \"5a4d8cc9-1731-4087-8e6f-ac0e184616fe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.932958 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.934982 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-742m8\" (UniqueName: \"kubernetes.io/projected/ec8148e4-5519-4b98-a17f-b80f1a44a4da-kube-api-access-742m8\") pod \"cluster-samples-operator-665b6dd947-c6jk6\" (UID: \"ec8148e4-5519-4b98-a17f-b80f1a44a4da\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.956734 4802 request.go:700] Waited for 1.900662662s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.962116 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv86v\" (UniqueName: \"kubernetes.io/projected/417dfff6-8c73-474a-8ec2-aadab4e32131-kube-api-access-wv86v\") pod \"console-operator-58897d9998-qfg52\" (UID: \"417dfff6-8c73-474a-8ec2-aadab4e32131\") " pod="openshift-console-operator/console-operator-58897d9998-qfg52"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.985730 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkmh7\" (UniqueName: \"kubernetes.io/projected/d2c2d1ec-c588-4247-aae2-c228404a38e0-kube-api-access-nkmh7\") pod \"machine-api-operator-5694c8668f-mx65z\" (UID: \"d2c2d1ec-c588-4247-aae2-c228404a38e0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z"
Dec 01 19:58:55 crc kubenswrapper[4802]: I1201 19:58:55.999353 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkqq9\" (UniqueName: \"kubernetes.io/projected/6af22b89-378d-4aab-a028-2e19ec6e8d1c-kube-api-access-pkqq9\") pod \"downloads-7954f5f757-gjlgw\" (UID: \"6af22b89-378d-4aab-a028-2e19ec6e8d1c\") " pod="openshift-console/downloads-7954f5f757-gjlgw"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.018867 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.031089 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm622\" (UniqueName: \"kubernetes.io/projected/6b139dad-bbb0-4d0f-bd11-14f142ef1767-kube-api-access-gm622\") pod \"controller-manager-879f6c89f-hv2h9\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.038526 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvtw\" (UniqueName: \"kubernetes.io/projected/21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2-kube-api-access-tsvtw\") pod \"etcd-operator-b45778765-7cxs2\" (UID: \"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.042042 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.050423 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.055445 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gjlgw"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.064216 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c852k\" (UniqueName: \"kubernetes.io/projected/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-kube-api-access-c852k\") pod \"console-f9d7485db-62kxj\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " pod="openshift-console/console-f9d7485db-62kxj"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.065165 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.073523 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mtnt\" (UniqueName: \"kubernetes.io/projected/46b256be-246d-43ad-99b9-7b67eca6762a-kube-api-access-4mtnt\") pod \"dns-operator-744455d44c-g42nq\" (UID: \"46b256be-246d-43ad-99b9-7b67eca6762a\") " pod="openshift-dns-operator/dns-operator-744455d44c-g42nq"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.077328 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.083569 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.094211 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvnbs\" (UniqueName: \"kubernetes.io/projected/d8d9c52e-a041-4e4c-a364-ef09f105a206-kube-api-access-nvnbs\") pod \"control-plane-machine-set-operator-78cbb6b69f-287bb\" (UID: \"d8d9c52e-a041-4e4c-a364-ef09f105a206\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.151999 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt"]
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176540 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e98194d-958a-4c56-b5a3-90e01eab1816-metrics-certs\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176565 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-auth-proxy-config\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176612 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-certificates\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176628 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksdg5\" (UniqueName: \"kubernetes.io/projected/921a6a6c-bdb5-4e35-8428-17fae5f50192-kube-api-access-ksdg5\") pod \"service-ca-operator-777779d784-6bdr9\" (UID: \"921a6a6c-bdb5-4e35-8428-17fae5f50192\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176681 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176701 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176720 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc8de31-caf7-493a-a966-64105cf2e8fc-config\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176736 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h42pm\" (UniqueName: \"kubernetes.io/projected/3dac3049-5e67-48ba-8584-be24cbcfdd36-kube-api-access-h42pm\") pod \"openshift-apiserver-operator-796bbdcf4f-h44gr\" (UID: \"3dac3049-5e67-48ba-8584-be24cbcfdd36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176754 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176769 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a5bc347e-a789-4ce0-8800-2c322039d4a5-apiservice-cert\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176804 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2pb67\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pb67"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176838 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-config\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176860 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/081dccc6-dbee-40a9-8333-d1c178e3fab3-secret-volume\") pod \"collect-profiles-29410305-db4dv\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176888 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed226f25-eec2-4393-961c-9c8d6011e8dc-config\") pod \"kube-apiserver-operator-766d6c64bb-s8dcm\" (UID: \"ed226f25-eec2-4393-961c-9c8d6011e8dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176946 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-config\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176960 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7b1b1b8-14c6-4649-b791-1de21278aa35-trusted-ca\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n"
Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176977 4802 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a5bc347e-a789-4ce0-8800-2c322039d4a5-webhook-cert\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.176993 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4c9f8f21-904c-465c-8acb-5e552719a02f-signing-key\") pod \"service-ca-9c57cc56f-6rtx7\" (UID: \"4c9f8f21-904c-465c-8acb-5e552719a02f\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177015 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94zvr\" (UniqueName: \"kubernetes.io/projected/4dc8de31-caf7-493a-a966-64105cf2e8fc-kube-api-access-94zvr\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177031 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe0e9324-8a2b-499e-9fde-72e7f3effad9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kbqcj\" (UID: \"fe0e9324-8a2b-499e-9fde-72e7f3effad9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177045 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/18c8b374-04ab-4b76-93d5-dd84b990a54e-audit-dir\") pod \"apiserver-76f77b778f-4czjq\" (UID: 
\"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177069 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e98194d-958a-4c56-b5a3-90e01eab1816-service-ca-bundle\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177084 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/081dccc6-dbee-40a9-8333-d1c178e3fab3-config-volume\") pod \"collect-profiles-29410305-db4dv\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177114 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6cp5\" (UniqueName: \"kubernetes.io/projected/081dccc6-dbee-40a9-8333-d1c178e3fab3-kube-api-access-x6cp5\") pod \"collect-profiles-29410305-db4dv\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177128 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed226f25-eec2-4393-961c-9c8d6011e8dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s8dcm\" (UID: \"ed226f25-eec2-4393-961c-9c8d6011e8dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177143 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed226f25-eec2-4393-961c-9c8d6011e8dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s8dcm\" (UID: \"ed226f25-eec2-4393-961c-9c8d6011e8dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:56 crc kubenswrapper[4802]: E1201 19:58:56.177258 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:56.677238529 +0000 UTC m=+158.239798260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177388 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2rzw\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-kube-api-access-r2rzw\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177408 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmv2r\" (UniqueName: \"kubernetes.io/projected/ad1c23af-b7e5-4d78-b9c0-272b55237564-kube-api-access-xmv2r\") pod \"multus-admission-controller-857f4d67dd-mn4k7\" (UID: 
\"ad1c23af-b7e5-4d78-b9c0-272b55237564\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177437 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wr4l\" (UniqueName: \"kubernetes.io/projected/18c8b374-04ab-4b76-93d5-dd84b990a54e-kube-api-access-8wr4l\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177452 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/18c8b374-04ab-4b76-93d5-dd84b990a54e-node-pullsecrets\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177468 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177491 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48p8f\" (UniqueName: \"kubernetes.io/projected/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-kube-api-access-48p8f\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177509 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnccf\" (UniqueName: \"kubernetes.io/projected/2a619835-3b06-4cd8-8a68-87c6a1a997c5-kube-api-access-lnccf\") pod \"package-server-manager-789f6589d5-dml5n\" (UID: \"2a619835-3b06-4cd8-8a68-87c6a1a997c5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177523 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/18c8b374-04ab-4b76-93d5-dd84b990a54e-encryption-config\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177540 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-bound-sa-token\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177553 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e98194d-958a-4c56-b5a3-90e01eab1816-default-certificate\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177631 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79289702-5b61-4c95-9ed7-371d48b3cd4d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sdh6r\" (UID: 
\"79289702-5b61-4c95-9ed7-371d48b3cd4d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177650 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fm7c\" (UniqueName: \"kubernetes.io/projected/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-kube-api-access-2fm7c\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177668 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z7bl\" (UniqueName: \"kubernetes.io/projected/b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c-kube-api-access-9z7bl\") pod \"kube-storage-version-migrator-operator-b67b599dd-pl7mh\" (UID: \"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177684 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-trusted-ca\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177699 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dc8de31-caf7-493a-a966-64105cf2e8fc-serving-cert\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc 
kubenswrapper[4802]: I1201 19:58:56.177714 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ad1c23af-b7e5-4d78-b9c0-272b55237564-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mn4k7\" (UID: \"ad1c23af-b7e5-4d78-b9c0-272b55237564\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177740 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76n79\" (UniqueName: \"kubernetes.io/projected/79289702-5b61-4c95-9ed7-371d48b3cd4d-kube-api-access-76n79\") pod \"olm-operator-6b444d44fb-sdh6r\" (UID: \"79289702-5b61-4c95-9ed7-371d48b3cd4d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177768 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dc8de31-caf7-493a-a966-64105cf2e8fc-service-ca-bundle\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177783 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18c8b374-04ab-4b76-93d5-dd84b990a54e-serving-cert\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177799 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4dc8de31-caf7-493a-a966-64105cf2e8fc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177816 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04f115d4-aeb2-4fa4-b561-095eff3a18ea-proxy-tls\") pod \"machine-config-controller-84d6567774-w2f2s\" (UID: \"04f115d4-aeb2-4fa4-b561-095eff3a18ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177831 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-image-import-ca\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177857 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177883 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-tls\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc 
kubenswrapper[4802]: I1201 19:58:56.177906 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht4rl\" (UniqueName: \"kubernetes.io/projected/7e98194d-958a-4c56-b5a3-90e01eab1816-kube-api-access-ht4rl\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177961 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe0e9324-8a2b-499e-9fde-72e7f3effad9-config\") pod \"kube-controller-manager-operator-78b949d7b-kbqcj\" (UID: \"fe0e9324-8a2b-499e-9fde-72e7f3effad9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177980 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pl7mh\" (UID: \"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.177995 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7b1b1b8-14c6-4649-b791-1de21278aa35-metrics-tls\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178021 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178037 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pl7mh\" (UID: \"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178052 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dac3049-5e67-48ba-8584-be24cbcfdd36-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h44gr\" (UID: \"3dac3049-5e67-48ba-8584-be24cbcfdd36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178096 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/921a6a6c-bdb5-4e35-8428-17fae5f50192-serving-cert\") pod \"service-ca-operator-777779d784-6bdr9\" (UID: \"921a6a6c-bdb5-4e35-8428-17fae5f50192\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178140 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/18c8b374-04ab-4b76-93d5-dd84b990a54e-etcd-client\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178157 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2pb67\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178182 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95mns\" (UniqueName: \"kubernetes.io/projected/4c9f8f21-904c-465c-8acb-5e552719a02f-kube-api-access-95mns\") pod \"service-ca-9c57cc56f-6rtx7\" (UID: \"4c9f8f21-904c-465c-8acb-5e552719a02f\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178209 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-audit\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178223 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a5bc347e-a789-4ce0-8800-2c322039d4a5-tmpfs\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178239 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/79289702-5b61-4c95-9ed7-371d48b3cd4d-srv-cert\") pod \"olm-operator-6b444d44fb-sdh6r\" (UID: \"79289702-5b61-4c95-9ed7-371d48b3cd4d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178255 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-etcd-serving-ca\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178273 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk8bm\" (UniqueName: \"kubernetes.io/projected/0953c320-4dd8-4914-a84d-01bf5e9f11aa-kube-api-access-gk8bm\") pod \"marketplace-operator-79b997595-2pb67\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178289 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178314 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04f115d4-aeb2-4fa4-b561-095eff3a18ea-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w2f2s\" (UID: \"04f115d4-aeb2-4fa4-b561-095eff3a18ea\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178330 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921a6a6c-bdb5-4e35-8428-17fae5f50192-config\") pod \"service-ca-operator-777779d784-6bdr9\" (UID: \"921a6a6c-bdb5-4e35-8428-17fae5f50192\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178344 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7b1b1b8-14c6-4649-b791-1de21278aa35-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178419 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4gt\" (UniqueName: \"kubernetes.io/projected/a7b1b1b8-14c6-4649-b791-1de21278aa35-kube-api-access-4n4gt\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178471 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe0e9324-8a2b-499e-9fde-72e7f3effad9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kbqcj\" (UID: \"fe0e9324-8a2b-499e-9fde-72e7f3effad9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178499 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-machine-approver-tls\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178522 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4c9f8f21-904c-465c-8acb-5e552719a02f-signing-cabundle\") pod \"service-ca-9c57cc56f-6rtx7\" (UID: \"4c9f8f21-904c-465c-8acb-5e552719a02f\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178550 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6g25\" (UniqueName: \"kubernetes.io/projected/04f115d4-aeb2-4fa4-b561-095eff3a18ea-kube-api-access-g6g25\") pod \"machine-config-controller-84d6567774-w2f2s\" (UID: \"04f115d4-aeb2-4fa4-b561-095eff3a18ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178576 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dac3049-5e67-48ba-8584-be24cbcfdd36-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h44gr\" (UID: \"3dac3049-5e67-48ba-8584-be24cbcfdd36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178601 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e98194d-958a-4c56-b5a3-90e01eab1816-stats-auth\") pod 
\"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178632 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6rkh\" (UniqueName: \"kubernetes.io/projected/a5bc347e-a789-4ce0-8800-2c322039d4a5-kube-api-access-g6rkh\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.178656 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a619835-3b06-4cd8-8a68-87c6a1a997c5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dml5n\" (UID: \"2a619835-3b06-4cd8-8a68-87c6a1a997c5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.181931 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.203536 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.238889 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.249874 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.251429 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.258489 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.280848 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.280977 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.280998 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc8de31-caf7-493a-a966-64105cf2e8fc-config\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281016 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h42pm\" (UniqueName: \"kubernetes.io/projected/3dac3049-5e67-48ba-8584-be24cbcfdd36-kube-api-access-h42pm\") pod \"openshift-apiserver-operator-796bbdcf4f-h44gr\" (UID: \"3dac3049-5e67-48ba-8584-be24cbcfdd36\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281037 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-csi-data-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281064 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a5bc347e-a789-4ce0-8800-2c322039d4a5-apiservice-cert\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281080 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281096 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2pb67\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281113 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a686218d-cec1-409d-a394-a563b438fbf0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5bdsk\" (UID: \"a686218d-cec1-409d-a394-a563b438fbf0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281132 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-config\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281150 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctrj4\" (UniqueName: \"kubernetes.io/projected/56c261dd-49c4-4f69-9400-7a012e281b7b-kube-api-access-ctrj4\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281169 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/081dccc6-dbee-40a9-8333-d1c178e3fab3-secret-volume\") pod \"collect-profiles-29410305-db4dv\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281188 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed226f25-eec2-4393-961c-9c8d6011e8dc-config\") pod \"kube-apiserver-operator-766d6c64bb-s8dcm\" (UID: \"ed226f25-eec2-4393-961c-9c8d6011e8dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:56 crc 
kubenswrapper[4802]: I1201 19:58:56.281225 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8mbf\" (UniqueName: \"kubernetes.io/projected/4bdd38ce-c886-49bc-887c-b2936750731c-kube-api-access-k8mbf\") pod \"catalog-operator-68c6474976-cvht4\" (UID: \"4bdd38ce-c886-49bc-887c-b2936750731c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281252 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-config\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281268 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7b1b1b8-14c6-4649-b791-1de21278aa35-trusted-ca\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281293 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a5bc347e-a789-4ce0-8800-2c322039d4a5-webhook-cert\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281309 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4c9f8f21-904c-465c-8acb-5e552719a02f-signing-key\") pod \"service-ca-9c57cc56f-6rtx7\" (UID: \"4c9f8f21-904c-465c-8acb-5e552719a02f\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281325 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94zvr\" (UniqueName: \"kubernetes.io/projected/4dc8de31-caf7-493a-a966-64105cf2e8fc-kube-api-access-94zvr\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281351 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe0e9324-8a2b-499e-9fde-72e7f3effad9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kbqcj\" (UID: \"fe0e9324-8a2b-499e-9fde-72e7f3effad9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281368 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/18c8b374-04ab-4b76-93d5-dd84b990a54e-audit-dir\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281384 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-mountpoint-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281402 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7e98194d-958a-4c56-b5a3-90e01eab1816-service-ca-bundle\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281419 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6cp5\" (UniqueName: \"kubernetes.io/projected/081dccc6-dbee-40a9-8333-d1c178e3fab3-kube-api-access-x6cp5\") pod \"collect-profiles-29410305-db4dv\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281436 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed226f25-eec2-4393-961c-9c8d6011e8dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s8dcm\" (UID: \"ed226f25-eec2-4393-961c-9c8d6011e8dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281453 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsvqt\" (UniqueName: \"kubernetes.io/projected/8b7d6d96-d0c1-4280-9f00-041935714120-kube-api-access-gsvqt\") pod \"machine-config-server-t4kb6\" (UID: \"8b7d6d96-d0c1-4280-9f00-041935714120\") " pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.281471 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/081dccc6-dbee-40a9-8333-d1c178e3fab3-config-volume\") pod \"collect-profiles-29410305-db4dv\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:56 crc 
kubenswrapper[4802]: E1201 19:58:56.281753 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:56.781735605 +0000 UTC m=+158.344295246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.282623 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc8de31-caf7-493a-a966-64105cf2e8fc-config\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.283208 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed226f25-eec2-4393-961c-9c8d6011e8dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s8dcm\" (UID: \"ed226f25-eec2-4393-961c-9c8d6011e8dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.283256 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a686218d-cec1-409d-a394-a563b438fbf0-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-5bdsk\" (UID: \"a686218d-cec1-409d-a394-a563b438fbf0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.283288 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2rzw\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-kube-api-access-r2rzw\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.283306 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmv2r\" (UniqueName: \"kubernetes.io/projected/ad1c23af-b7e5-4d78-b9c0-272b55237564-kube-api-access-xmv2r\") pod \"multus-admission-controller-857f4d67dd-mn4k7\" (UID: \"ad1c23af-b7e5-4d78-b9c0-272b55237564\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.283330 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wr4l\" (UniqueName: \"kubernetes.io/projected/18c8b374-04ab-4b76-93d5-dd84b990a54e-kube-api-access-8wr4l\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.283353 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/18c8b374-04ab-4b76-93d5-dd84b990a54e-node-pullsecrets\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.283501 4802 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284122 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/081dccc6-dbee-40a9-8333-d1c178e3fab3-config-volume\") pod \"collect-profiles-29410305-db4dv\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284287 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284329 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48p8f\" (UniqueName: \"kubernetes.io/projected/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-kube-api-access-48p8f\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284357 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnccf\" (UniqueName: \"kubernetes.io/projected/2a619835-3b06-4cd8-8a68-87c6a1a997c5-kube-api-access-lnccf\") pod \"package-server-manager-789f6589d5-dml5n\" (UID: \"2a619835-3b06-4cd8-8a68-87c6a1a997c5\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284374 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/18c8b374-04ab-4b76-93d5-dd84b990a54e-encryption-config\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284393 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43dc6cdf-8ced-494c-8237-75ff4d23caba-proxy-tls\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284410 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-socket-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284431 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-bound-sa-token\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284448 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fm7c\" (UniqueName: 
\"kubernetes.io/projected/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-kube-api-access-2fm7c\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284466 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z7bl\" (UniqueName: \"kubernetes.io/projected/b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c-kube-api-access-9z7bl\") pod \"kube-storage-version-migrator-operator-b67b599dd-pl7mh\" (UID: \"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284485 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6bk\" (UniqueName: \"kubernetes.io/projected/6d40e8d8-8182-431a-94c5-a27d77b773e1-kube-api-access-qk6bk\") pod \"ingress-canary-dtz7g\" (UID: \"6d40e8d8-8182-431a-94c5-a27d77b773e1\") " pod="openshift-ingress-canary/ingress-canary-dtz7g" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284504 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e98194d-958a-4c56-b5a3-90e01eab1816-default-certificate\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284521 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79289702-5b61-4c95-9ed7-371d48b3cd4d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sdh6r\" (UID: \"79289702-5b61-4c95-9ed7-371d48b3cd4d\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284537 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dc8de31-caf7-493a-a966-64105cf2e8fc-serving-cert\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284559 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ad1c23af-b7e5-4d78-b9c0-272b55237564-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mn4k7\" (UID: \"ad1c23af-b7e5-4d78-b9c0-272b55237564\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284575 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-trusted-ca\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284592 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76n79\" (UniqueName: \"kubernetes.io/projected/79289702-5b61-4c95-9ed7-371d48b3cd4d-kube-api-access-76n79\") pod \"olm-operator-6b444d44fb-sdh6r\" (UID: \"79289702-5b61-4c95-9ed7-371d48b3cd4d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284610 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-config\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.284635 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dc8de31-caf7-493a-a966-64105cf2e8fc-service-ca-bundle\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.286474 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/18c8b374-04ab-4b76-93d5-dd84b990a54e-audit-dir\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.288906 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed226f25-eec2-4393-961c-9c8d6011e8dc-config\") pod \"kube-apiserver-operator-766d6c64bb-s8dcm\" (UID: \"ed226f25-eec2-4393-961c-9c8d6011e8dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.289293 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18c8b374-04ab-4b76-93d5-dd84b990a54e-serving-cert\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291039 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04f115d4-aeb2-4fa4-b561-095eff3a18ea-proxy-tls\") pod \"machine-config-controller-84d6567774-w2f2s\" (UID: \"04f115d4-aeb2-4fa4-b561-095eff3a18ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291063 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-image-import-ca\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291086 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dc8de31-caf7-493a-a966-64105cf2e8fc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291109 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291153 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-tls\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 
crc kubenswrapper[4802]: I1201 19:58:56.291179 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-client-ca\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291214 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-plugins-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291238 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht4rl\" (UniqueName: \"kubernetes.io/projected/7e98194d-958a-4c56-b5a3-90e01eab1816-kube-api-access-ht4rl\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291260 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe0e9324-8a2b-499e-9fde-72e7f3effad9-config\") pod \"kube-controller-manager-operator-78b949d7b-kbqcj\" (UID: \"fe0e9324-8a2b-499e-9fde-72e7f3effad9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291278 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-pl7mh\" (UID: \"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291297 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7b1b1b8-14c6-4649-b791-1de21278aa35-metrics-tls\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291332 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8b7d6d96-d0c1-4280-9f00-041935714120-certs\") pod \"machine-config-server-t4kb6\" (UID: \"8b7d6d96-d0c1-4280-9f00-041935714120\") " pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291352 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291369 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pl7mh\" (UID: \"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291390 
4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dac3049-5e67-48ba-8584-be24cbcfdd36-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h44gr\" (UID: \"3dac3049-5e67-48ba-8584-be24cbcfdd36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291412 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4bdd38ce-c886-49bc-887c-b2936750731c-srv-cert\") pod \"catalog-operator-68c6474976-cvht4\" (UID: \"4bdd38ce-c886-49bc-887c-b2936750731c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291458 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/921a6a6c-bdb5-4e35-8428-17fae5f50192-serving-cert\") pod \"service-ca-operator-777779d784-6bdr9\" (UID: \"921a6a6c-bdb5-4e35-8428-17fae5f50192\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291504 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q857k\" (UniqueName: \"kubernetes.io/projected/ff03dfdf-cc89-44be-80ec-33df2a4c006b-kube-api-access-q857k\") pod \"dns-default-g7fvq\" (UID: \"ff03dfdf-cc89-44be-80ec-33df2a4c006b\") " pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291525 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7b1b1b8-14c6-4649-b791-1de21278aa35-trusted-ca\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291531 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/18c8b374-04ab-4b76-93d5-dd84b990a54e-etcd-client\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291590 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2pb67\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291618 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff03dfdf-cc89-44be-80ec-33df2a4c006b-metrics-tls\") pod \"dns-default-g7fvq\" (UID: \"ff03dfdf-cc89-44be-80ec-33df2a4c006b\") " pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291653 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95mns\" (UniqueName: \"kubernetes.io/projected/4c9f8f21-904c-465c-8acb-5e552719a02f-kube-api-access-95mns\") pod \"service-ca-9c57cc56f-6rtx7\" (UID: \"4c9f8f21-904c-465c-8acb-5e552719a02f\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291677 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-audit\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") 
" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291714 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a5bc347e-a789-4ce0-8800-2c322039d4a5-tmpfs\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291736 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79289702-5b61-4c95-9ed7-371d48b3cd4d-srv-cert\") pod \"olm-operator-6b444d44fb-sdh6r\" (UID: \"79289702-5b61-4c95-9ed7-371d48b3cd4d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291759 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-etcd-serving-ca\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291789 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk8bm\" (UniqueName: \"kubernetes.io/projected/0953c320-4dd8-4914-a84d-01bf5e9f11aa-kube-api-access-gk8bm\") pod \"marketplace-operator-79b997595-2pb67\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291813 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/43dc6cdf-8ced-494c-8237-75ff4d23caba-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291835 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq7wb\" (UniqueName: \"kubernetes.io/projected/43dc6cdf-8ced-494c-8237-75ff4d23caba-kube-api-access-hq7wb\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291859 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqnfr\" (UniqueName: \"kubernetes.io/projected/a686218d-cec1-409d-a394-a563b438fbf0-kube-api-access-pqnfr\") pod \"openshift-controller-manager-operator-756b6f6bc6-5bdsk\" (UID: \"a686218d-cec1-409d-a394-a563b438fbf0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291889 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291931 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d40e8d8-8182-431a-94c5-a27d77b773e1-cert\") pod \"ingress-canary-dtz7g\" (UID: \"6d40e8d8-8182-431a-94c5-a27d77b773e1\") " pod="openshift-ingress-canary/ingress-canary-dtz7g" Dec 01 19:58:56 crc 
kubenswrapper[4802]: I1201 19:58:56.291958 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04f115d4-aeb2-4fa4-b561-095eff3a18ea-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w2f2s\" (UID: \"04f115d4-aeb2-4fa4-b561-095eff3a18ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.291981 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921a6a6c-bdb5-4e35-8428-17fae5f50192-config\") pod \"service-ca-operator-777779d784-6bdr9\" (UID: \"921a6a6c-bdb5-4e35-8428-17fae5f50192\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292003 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7b1b1b8-14c6-4649-b791-1de21278aa35-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292029 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4gt\" (UniqueName: \"kubernetes.io/projected/a7b1b1b8-14c6-4649-b791-1de21278aa35-kube-api-access-4n4gt\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292051 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4bdd38ce-c886-49bc-887c-b2936750731c-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-cvht4\" (UID: \"4bdd38ce-c886-49bc-887c-b2936750731c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292081 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5crxz\" (UniqueName: \"kubernetes.io/projected/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-kube-api-access-5crxz\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292103 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe0e9324-8a2b-499e-9fde-72e7f3effad9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kbqcj\" (UID: \"fe0e9324-8a2b-499e-9fde-72e7f3effad9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292125 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-machine-approver-tls\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292122 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe0e9324-8a2b-499e-9fde-72e7f3effad9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kbqcj\" (UID: \"fe0e9324-8a2b-499e-9fde-72e7f3effad9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292147 
4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dac3049-5e67-48ba-8584-be24cbcfdd36-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h44gr\" (UID: \"3dac3049-5e67-48ba-8584-be24cbcfdd36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292168 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff03dfdf-cc89-44be-80ec-33df2a4c006b-config-volume\") pod \"dns-default-g7fvq\" (UID: \"ff03dfdf-cc89-44be-80ec-33df2a4c006b\") " pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292190 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8b7d6d96-d0c1-4280-9f00-041935714120-node-bootstrap-token\") pod \"machine-config-server-t4kb6\" (UID: \"8b7d6d96-d0c1-4280-9f00-041935714120\") " pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292238 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4c9f8f21-904c-465c-8acb-5e552719a02f-signing-cabundle\") pod \"service-ca-9c57cc56f-6rtx7\" (UID: \"4c9f8f21-904c-465c-8acb-5e552719a02f\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.290901 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-trusted-ca\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 
19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292262 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6g25\" (UniqueName: \"kubernetes.io/projected/04f115d4-aeb2-4fa4-b561-095eff3a18ea-kube-api-access-g6g25\") pod \"machine-config-controller-84d6567774-w2f2s\" (UID: \"04f115d4-aeb2-4fa4-b561-095eff3a18ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292316 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6rkh\" (UniqueName: \"kubernetes.io/projected/a5bc347e-a789-4ce0-8800-2c322039d4a5-kube-api-access-g6rkh\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292344 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a619835-3b06-4cd8-8a68-87c6a1a997c5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dml5n\" (UID: \"2a619835-3b06-4cd8-8a68-87c6a1a997c5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292370 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a5bc347e-a789-4ce0-8800-2c322039d4a5-apiservice-cert\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292378 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/43dc6cdf-8ced-494c-8237-75ff4d23caba-images\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.292414 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e98194d-958a-4c56-b5a3-90e01eab1816-stats-auth\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.289398 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-config\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.289618 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/18c8b374-04ab-4b76-93d5-dd84b990a54e-node-pullsecrets\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.290310 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e98194d-958a-4c56-b5a3-90e01eab1816-service-ca-bundle\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.293623 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4dc8de31-caf7-493a-a966-64105cf2e8fc-service-ca-bundle\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.293671 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2pb67\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.294063 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pl7mh\" (UID: \"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.294101 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe0e9324-8a2b-499e-9fde-72e7f3effad9-config\") pod \"kube-controller-manager-operator-78b949d7b-kbqcj\" (UID: \"fe0e9324-8a2b-499e-9fde-72e7f3effad9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.294499 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dac3049-5e67-48ba-8584-be24cbcfdd36-config\") pod \"openshift-apiserver-operator-796bbdcf4f-h44gr\" (UID: \"3dac3049-5e67-48ba-8584-be24cbcfdd36\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.294718 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dc8de31-caf7-493a-a966-64105cf2e8fc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.294948 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-audit\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.295639 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.295767 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.295888 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04f115d4-aeb2-4fa4-b561-095eff3a18ea-mcc-auth-proxy-config\") 
pod \"machine-config-controller-84d6567774-w2f2s\" (UID: \"04f115d4-aeb2-4fa4-b561-095eff3a18ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.296586 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921a6a6c-bdb5-4e35-8428-17fae5f50192-config\") pod \"service-ca-operator-777779d784-6bdr9\" (UID: \"921a6a6c-bdb5-4e35-8428-17fae5f50192\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.297319 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/081dccc6-dbee-40a9-8333-d1c178e3fab3-secret-volume\") pod \"collect-profiles-29410305-db4dv\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.298377 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e98194d-958a-4c56-b5a3-90e01eab1816-metrics-certs\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.298416 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-auth-proxy-config\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.298412 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/a5bc347e-a789-4ce0-8800-2c322039d4a5-tmpfs\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.298447 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-registration-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.299425 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-certificates\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.299449 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksdg5\" (UniqueName: \"kubernetes.io/projected/921a6a6c-bdb5-4e35-8428-17fae5f50192-kube-api-access-ksdg5\") pod \"service-ca-operator-777779d784-6bdr9\" (UID: \"921a6a6c-bdb5-4e35-8428-17fae5f50192\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.299521 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56c261dd-49c4-4f69-9400-7a012e281b7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc 
kubenswrapper[4802]: I1201 19:58:56.300676 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-tls\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.301283 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a5bc347e-a789-4ce0-8800-2c322039d4a5-webhook-cert\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.301495 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-config\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.300314 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4c9f8f21-904c-465c-8acb-5e552719a02f-signing-cabundle\") pod \"service-ca-9c57cc56f-6rtx7\" (UID: \"4c9f8f21-904c-465c-8acb-5e552719a02f\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.301924 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-certificates\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 
19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.302177 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e98194d-958a-4c56-b5a3-90e01eab1816-default-certificate\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.302633 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ad1c23af-b7e5-4d78-b9c0-272b55237564-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mn4k7\" (UID: \"ad1c23af-b7e5-4d78-b9c0-272b55237564\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.302879 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e98194d-958a-4c56-b5a3-90e01eab1816-metrics-certs\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.303321 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.303402 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/79289702-5b61-4c95-9ed7-371d48b3cd4d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sdh6r\" (UID: \"79289702-5b61-4c95-9ed7-371d48b3cd4d\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.303549 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2pb67\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.303774 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dac3049-5e67-48ba-8584-be24cbcfdd36-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-h44gr\" (UID: \"3dac3049-5e67-48ba-8584-be24cbcfdd36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.304143 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18c8b374-04ab-4b76-93d5-dd84b990a54e-serving-cert\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.304273 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed226f25-eec2-4393-961c-9c8d6011e8dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s8dcm\" (UID: \"ed226f25-eec2-4393-961c-9c8d6011e8dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.304381 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/921a6a6c-bdb5-4e35-8428-17fae5f50192-serving-cert\") pod 
\"service-ca-operator-777779d784-6bdr9\" (UID: \"921a6a6c-bdb5-4e35-8428-17fae5f50192\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.304568 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/18c8b374-04ab-4b76-93d5-dd84b990a54e-etcd-client\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.304657 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.304669 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-etcd-serving-ca\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.304876 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/18c8b374-04ab-4b76-93d5-dd84b990a54e-image-import-ca\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.304905 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e98194d-958a-4c56-b5a3-90e01eab1816-stats-auth\") pod 
\"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.305111 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-auth-proxy-config\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.305461 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/18c8b374-04ab-4b76-93d5-dd84b990a54e-encryption-config\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.305600 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a619835-3b06-4cd8-8a68-87c6a1a997c5-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dml5n\" (UID: \"2a619835-3b06-4cd8-8a68-87c6a1a997c5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.308583 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pl7mh\" (UID: \"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.309249 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4c9f8f21-904c-465c-8acb-5e552719a02f-signing-key\") pod \"service-ca-9c57cc56f-6rtx7\" (UID: \"4c9f8f21-904c-465c-8acb-5e552719a02f\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.309436 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/79289702-5b61-4c95-9ed7-371d48b3cd4d-srv-cert\") pod \"olm-operator-6b444d44fb-sdh6r\" (UID: \"79289702-5b61-4c95-9ed7-371d48b3cd4d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.312659 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04f115d4-aeb2-4fa4-b561-095eff3a18ea-proxy-tls\") pod \"machine-config-controller-84d6567774-w2f2s\" (UID: \"04f115d4-aeb2-4fa4-b561-095eff3a18ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.313127 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dc8de31-caf7-493a-a966-64105cf2e8fc-serving-cert\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.313826 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-machine-approver-tls\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 
19:58:56.314027 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7b1b1b8-14c6-4649-b791-1de21278aa35-metrics-tls\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.318300 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.320965 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.334728 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g42nq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.338269 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h42pm\" (UniqueName: \"kubernetes.io/projected/3dac3049-5e67-48ba-8584-be24cbcfdd36-kube-api-access-h42pm\") pod \"openshift-apiserver-operator-796bbdcf4f-h44gr\" (UID: \"3dac3049-5e67-48ba-8584-be24cbcfdd36\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.355740 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94zvr\" (UniqueName: \"kubernetes.io/projected/4dc8de31-caf7-493a-a966-64105cf2e8fc-kube-api-access-94zvr\") pod \"authentication-operator-69f744f599-s8fb5\" (UID: \"4dc8de31-caf7-493a-a966-64105cf2e8fc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.376502 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2rzw\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-kube-api-access-r2rzw\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.398311 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmv2r\" (UniqueName: \"kubernetes.io/projected/ad1c23af-b7e5-4d78-b9c0-272b55237564-kube-api-access-xmv2r\") pod \"multus-admission-controller-857f4d67dd-mn4k7\" (UID: \"ad1c23af-b7e5-4d78-b9c0-272b55237564\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402374 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctrj4\" 
(UniqueName: \"kubernetes.io/projected/56c261dd-49c4-4f69-9400-7a012e281b7b-kube-api-access-ctrj4\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402423 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8mbf\" (UniqueName: \"kubernetes.io/projected/4bdd38ce-c886-49bc-887c-b2936750731c-kube-api-access-k8mbf\") pod \"catalog-operator-68c6474976-cvht4\" (UID: \"4bdd38ce-c886-49bc-887c-b2936750731c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402454 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-mountpoint-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402493 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsvqt\" (UniqueName: \"kubernetes.io/projected/8b7d6d96-d0c1-4280-9f00-041935714120-kube-api-access-gsvqt\") pod \"machine-config-server-t4kb6\" (UID: \"8b7d6d96-d0c1-4280-9f00-041935714120\") " pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402512 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a686218d-cec1-409d-a394-a563b438fbf0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5bdsk\" (UID: \"a686218d-cec1-409d-a394-a563b438fbf0\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402555 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43dc6cdf-8ced-494c-8237-75ff4d23caba-proxy-tls\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402569 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-socket-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402601 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6bk\" (UniqueName: \"kubernetes.io/projected/6d40e8d8-8182-431a-94c5-a27d77b773e1-kube-api-access-qk6bk\") pod \"ingress-canary-dtz7g\" (UID: \"6d40e8d8-8182-431a-94c5-a27d77b773e1\") " pod="openshift-ingress-canary/ingress-canary-dtz7g" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402628 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-config\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402674 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-client-ca\") pod 
\"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402688 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-plugins-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402715 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8b7d6d96-d0c1-4280-9f00-041935714120-certs\") pod \"machine-config-server-t4kb6\" (UID: \"8b7d6d96-d0c1-4280-9f00-041935714120\") " pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402732 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4bdd38ce-c886-49bc-887c-b2936750731c-srv-cert\") pod \"catalog-operator-68c6474976-cvht4\" (UID: \"4bdd38ce-c886-49bc-887c-b2936750731c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402751 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q857k\" (UniqueName: \"kubernetes.io/projected/ff03dfdf-cc89-44be-80ec-33df2a4c006b-kube-api-access-q857k\") pod \"dns-default-g7fvq\" (UID: \"ff03dfdf-cc89-44be-80ec-33df2a4c006b\") " pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402769 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff03dfdf-cc89-44be-80ec-33df2a4c006b-metrics-tls\") 
pod \"dns-default-g7fvq\" (UID: \"ff03dfdf-cc89-44be-80ec-33df2a4c006b\") " pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402807 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/43dc6cdf-8ced-494c-8237-75ff4d23caba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402825 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq7wb\" (UniqueName: \"kubernetes.io/projected/43dc6cdf-8ced-494c-8237-75ff4d23caba-kube-api-access-hq7wb\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402841 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqnfr\" (UniqueName: \"kubernetes.io/projected/a686218d-cec1-409d-a394-a563b438fbf0-kube-api-access-pqnfr\") pod \"openshift-controller-manager-operator-756b6f6bc6-5bdsk\" (UID: \"a686218d-cec1-409d-a394-a563b438fbf0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402859 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d40e8d8-8182-431a-94c5-a27d77b773e1-cert\") pod \"ingress-canary-dtz7g\" (UID: \"6d40e8d8-8182-431a-94c5-a27d77b773e1\") " pod="openshift-ingress-canary/ingress-canary-dtz7g" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402888 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4bdd38ce-c886-49bc-887c-b2936750731c-profile-collector-cert\") pod \"catalog-operator-68c6474976-cvht4\" (UID: \"4bdd38ce-c886-49bc-887c-b2936750731c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402905 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5crxz\" (UniqueName: \"kubernetes.io/projected/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-kube-api-access-5crxz\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402932 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff03dfdf-cc89-44be-80ec-33df2a4c006b-config-volume\") pod \"dns-default-g7fvq\" (UID: \"ff03dfdf-cc89-44be-80ec-33df2a4c006b\") " pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402949 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8b7d6d96-d0c1-4280-9f00-041935714120-node-bootstrap-token\") pod \"machine-config-server-t4kb6\" (UID: \"8b7d6d96-d0c1-4280-9f00-041935714120\") " pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402971 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/43dc6cdf-8ced-494c-8237-75ff4d23caba-images\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.402989 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-registration-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.403011 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56c261dd-49c4-4f69-9400-7a012e281b7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.404614 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.404684 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-csi-data-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.404725 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a686218d-cec1-409d-a394-a563b438fbf0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5bdsk\" (UID: \"a686218d-cec1-409d-a394-a563b438fbf0\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.405625 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a686218d-cec1-409d-a394-a563b438fbf0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5bdsk\" (UID: \"a686218d-cec1-409d-a394-a563b438fbf0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: E1201 19:58:56.405929 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:56.90591195 +0000 UTC m=+158.468471591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.406465 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-csi-data-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.409455 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/43dc6cdf-8ced-494c-8237-75ff4d23caba-images\") pod 
\"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.409823 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nsqcw"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.409955 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/43dc6cdf-8ced-494c-8237-75ff4d23caba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.409962 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff03dfdf-cc89-44be-80ec-33df2a4c006b-config-volume\") pod \"dns-default-g7fvq\" (UID: \"ff03dfdf-cc89-44be-80ec-33df2a4c006b\") " pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.410095 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-registration-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.410115 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-socket-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.410149 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-plugins-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.411581 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4bdd38ce-c886-49bc-887c-b2936750731c-profile-collector-cert\") pod \"catalog-operator-68c6474976-cvht4\" (UID: \"4bdd38ce-c886-49bc-887c-b2936750731c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.411626 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-config\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.411832 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-mountpoint-dir\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.412001 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4bdd38ce-c886-49bc-887c-b2936750731c-srv-cert\") pod \"catalog-operator-68c6474976-cvht4\" (UID: \"4bdd38ce-c886-49bc-887c-b2936750731c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 
19:58:56.412125 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43dc6cdf-8ced-494c-8237-75ff4d23caba-proxy-tls\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.412144 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-client-ca\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.412531 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff03dfdf-cc89-44be-80ec-33df2a4c006b-metrics-tls\") pod \"dns-default-g7fvq\" (UID: \"ff03dfdf-cc89-44be-80ec-33df2a4c006b\") " pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.412842 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56c261dd-49c4-4f69-9400-7a012e281b7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.416372 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8b7d6d96-d0c1-4280-9f00-041935714120-node-bootstrap-token\") pod \"machine-config-server-t4kb6\" (UID: \"8b7d6d96-d0c1-4280-9f00-041935714120\") " pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 
crc kubenswrapper[4802]: I1201 19:58:56.417055 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8b7d6d96-d0c1-4280-9f00-041935714120-certs\") pod \"machine-config-server-t4kb6\" (UID: \"8b7d6d96-d0c1-4280-9f00-041935714120\") " pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.418038 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.418722 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a686218d-cec1-409d-a394-a563b438fbf0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5bdsk\" (UID: \"a686218d-cec1-409d-a394-a563b438fbf0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.418930 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d40e8d8-8182-431a-94c5-a27d77b773e1-cert\") pod \"ingress-canary-dtz7g\" (UID: \"6d40e8d8-8182-431a-94c5-a27d77b773e1\") " pod="openshift-ingress-canary/ingress-canary-dtz7g" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.421940 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6cp5\" (UniqueName: \"kubernetes.io/projected/081dccc6-dbee-40a9-8333-d1c178e3fab3-kube-api-access-x6cp5\") pod \"collect-profiles-29410305-db4dv\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.427073 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.432998 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z7bl\" (UniqueName: \"kubernetes.io/projected/b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c-kube-api-access-9z7bl\") pod \"kube-storage-version-migrator-operator-b67b599dd-pl7mh\" (UID: \"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.452738 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wr4l\" (UniqueName: \"kubernetes.io/projected/18c8b374-04ab-4b76-93d5-dd84b990a54e-kube-api-access-8wr4l\") pod \"apiserver-76f77b778f-4czjq\" (UID: \"18c8b374-04ab-4b76-93d5-dd84b990a54e\") " pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.463433 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hv2h9"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.477691 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-bound-sa-token\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.493006 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fm7c\" (UniqueName: \"kubernetes.io/projected/bda97cb1-ea4c-499a-8549-dc62ae7e08c6-kube-api-access-2fm7c\") pod \"cluster-image-registry-operator-dc59b4c8b-jnwr8\" (UID: \"bda97cb1-ea4c-499a-8549-dc62ae7e08c6\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.498491 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.506110 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:56 crc kubenswrapper[4802]: E1201 19:58:56.506397 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.00636505 +0000 UTC m=+158.568924691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.506638 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: E1201 19:58:56.507118 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.007103544 +0000 UTC m=+158.569663185 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.512898 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847" event={"ID":"5217fe99-8018-4db8-8f1d-2e0b33056638","Type":"ContainerStarted","Data":"b0acc89aed05d20ea41941b0837bb8052f900589683f4d7ef81cb3bccf8dfd50"} Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.514796 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" event={"ID":"099f3158-3fcc-4d67-9ee2-1a9229e1ad23","Type":"ContainerStarted","Data":"af89a79cd48862ba96ce233b3cd1b728c6669b27360774b924b5ad88eae2153a"} Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.516050 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gjlgw"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.521084 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed226f25-eec2-4393-961c-9c8d6011e8dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s8dcm\" (UID: \"ed226f25-eec2-4393-961c-9c8d6011e8dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.521855 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7cxs2"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.522881 4802 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" event={"ID":"5a4d8cc9-1731-4087-8e6f-ac0e184616fe","Type":"ContainerStarted","Data":"78be285a15aadf8f2a9aca4049ecec9ac23d0bb0d5e509ab000effbb68685993"} Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.522937 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-62kxj"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.534166 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48p8f\" (UniqueName: \"kubernetes.io/projected/efce51d1-6c15-4dde-a0e3-d86fc67c2f0f-kube-api-access-48p8f\") pod \"machine-approver-56656f9798-cdzf5\" (UID: \"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.546115 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mx65z"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.551694 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnccf\" (UniqueName: \"kubernetes.io/projected/2a619835-3b06-4cd8-8a68-87c6a1a997c5-kube-api-access-lnccf\") pod \"package-server-manager-789f6589d5-dml5n\" (UID: \"2a619835-3b06-4cd8-8a68-87c6a1a997c5\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.558601 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qfg52"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.567679 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.567779 4802 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.576911 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht4rl\" (UniqueName: \"kubernetes.io/projected/7e98194d-958a-4c56-b5a3-90e01eab1816-kube-api-access-ht4rl\") pod \"router-default-5444994796-lvx7b\" (UID: \"7e98194d-958a-4c56-b5a3-90e01eab1816\") " pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.577004 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.578570 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g42nq"] Dec 01 19:58:56 crc kubenswrapper[4802]: W1201 19:58:56.581563 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf38656_7a15_4cd1_9038_83272327ce3c.slice/crio-33d7f8738f9f3ca5cd99f028801fa9a18310a1a2043316cf5360f850c2491ca9 WatchSource:0}: Error finding container 33d7f8738f9f3ca5cd99f028801fa9a18310a1a2043316cf5360f850c2491ca9: Status 404 returned error can't find the container with id 33d7f8738f9f3ca5cd99f028801fa9a18310a1a2043316cf5360f850c2491ca9 Dec 01 19:58:56 crc kubenswrapper[4802]: W1201 19:58:56.584306 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b139dad_bbb0_4d0f_bd11_14f142ef1767.slice/crio-f1d4bb3293dee0da48beccae98e4ccad27878e2d9e5ca330618b95a1cab82856 WatchSource:0}: Error finding container f1d4bb3293dee0da48beccae98e4ccad27878e2d9e5ca330618b95a1cab82856: Status 404 returned error can't find the container with id f1d4bb3293dee0da48beccae98e4ccad27878e2d9e5ca330618b95a1cab82856 Dec 01 19:58:56 crc 
kubenswrapper[4802]: W1201 19:58:56.589990 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6af22b89_378d_4aab_a028_2e19ec6e8d1c.slice/crio-6ab21c489b622a2d0e87f6e0c85d6b1d269057f18eb11cd682d4f9da32c8c770 WatchSource:0}: Error finding container 6ab21c489b622a2d0e87f6e0c85d6b1d269057f18eb11cd682d4f9da32c8c770: Status 404 returned error can't find the container with id 6ab21c489b622a2d0e87f6e0c85d6b1d269057f18eb11cd682d4f9da32c8c770 Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.599128 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6g25\" (UniqueName: \"kubernetes.io/projected/04f115d4-aeb2-4fa4-b561-095eff3a18ea-kube-api-access-g6g25\") pod \"machine-config-controller-84d6567774-w2f2s\" (UID: \"04f115d4-aeb2-4fa4-b561-095eff3a18ea\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.602899 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.609480 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:56 crc kubenswrapper[4802]: E1201 19:58:56.610080 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.110053661 +0000 UTC m=+158.672613302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.610379 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: E1201 19:58:56.610901 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.110883966 +0000 UTC m=+158.673443607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.617323 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6rkh\" (UniqueName: \"kubernetes.io/projected/a5bc347e-a789-4ce0-8800-2c322039d4a5-kube-api-access-g6rkh\") pod \"packageserver-d55dfcdfc-bwjbd\" (UID: \"a5bc347e-a789-4ce0-8800-2c322039d4a5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.634296 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76n79\" (UniqueName: \"kubernetes.io/projected/79289702-5b61-4c95-9ed7-371d48b3cd4d-kube-api-access-76n79\") pod \"olm-operator-6b444d44fb-sdh6r\" (UID: \"79289702-5b61-4c95-9ed7-371d48b3cd4d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.680495 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95mns\" (UniqueName: \"kubernetes.io/projected/4c9f8f21-904c-465c-8acb-5e552719a02f-kube-api-access-95mns\") pod \"service-ca-9c57cc56f-6rtx7\" (UID: \"4c9f8f21-904c-465c-8acb-5e552719a02f\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.690300 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" Dec 01 19:58:56 crc kubenswrapper[4802]: W1201 19:58:56.694059 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb3789f_6490_4c1c_a5a8_54f3d9ea3bb0.slice/crio-ca931cd35e54699cce25254c173bc2327a2bfa63f0d88fdb47b1148c841b9c76 WatchSource:0}: Error finding container ca931cd35e54699cce25254c173bc2327a2bfa63f0d88fdb47b1148c841b9c76: Status 404 returned error can't find the container with id ca931cd35e54699cce25254c173bc2327a2bfa63f0d88fdb47b1148c841b9c76 Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.694437 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk8bm\" (UniqueName: \"kubernetes.io/projected/0953c320-4dd8-4914-a84d-01bf5e9f11aa-kube-api-access-gk8bm\") pod \"marketplace-operator-79b997595-2pb67\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.698439 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.710234 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.711223 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:56 crc kubenswrapper[4802]: E1201 19:58:56.711607 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.211591935 +0000 UTC m=+158.774151576 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.718436 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7b1b1b8-14c6-4649-b791-1de21278aa35-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.734632 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.741242 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n4gt\" (UniqueName: \"kubernetes.io/projected/a7b1b1b8-14c6-4649-b791-1de21278aa35-kube-api-access-4n4gt\") pod \"ingress-operator-5b745b69d9-dn99n\" (UID: \"a7b1b1b8-14c6-4649-b791-1de21278aa35\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.741517 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.754655 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe0e9324-8a2b-499e-9fde-72e7f3effad9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kbqcj\" (UID: \"fe0e9324-8a2b-499e-9fde-72e7f3effad9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.778752 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksdg5\" (UniqueName: \"kubernetes.io/projected/921a6a6c-bdb5-4e35-8428-17fae5f50192-kube-api-access-ksdg5\") pod \"service-ca-operator-777779d784-6bdr9\" (UID: \"921a6a6c-bdb5-4e35-8428-17fae5f50192\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.794392 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q857k\" (UniqueName: \"kubernetes.io/projected/ff03dfdf-cc89-44be-80ec-33df2a4c006b-kube-api-access-q857k\") pod \"dns-default-g7fvq\" (UID: \"ff03dfdf-cc89-44be-80ec-33df2a4c006b\") " pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc 
kubenswrapper[4802]: I1201 19:58:56.806635 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.812484 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:56 crc kubenswrapper[4802]: E1201 19:58:56.812922 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.312908992 +0000 UTC m=+158.875468623 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.813993 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5crxz\" (UniqueName: \"kubernetes.io/projected/94fd9d5d-ddb0-4b19-83c1-d8df802aec70-kube-api-access-5crxz\") pod \"csi-hostpathplugin-hqg8s\" (UID: \"94fd9d5d-ddb0-4b19-83c1-d8df802aec70\") " pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.814185 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.818450 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.823633 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.831818 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.839863 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq7wb\" (UniqueName: \"kubernetes.io/projected/43dc6cdf-8ced-494c-8237-75ff4d23caba-kube-api-access-hq7wb\") pod \"machine-config-operator-74547568cd-bjmdk\" (UID: \"43dc6cdf-8ced-494c-8237-75ff4d23caba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.841767 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.866028 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqnfr\" (UniqueName: \"kubernetes.io/projected/a686218d-cec1-409d-a394-a563b438fbf0-kube-api-access-pqnfr\") pod \"openshift-controller-manager-operator-756b6f6bc6-5bdsk\" (UID: \"a686218d-cec1-409d-a394-a563b438fbf0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.883492 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6bk\" (UniqueName: \"kubernetes.io/projected/6d40e8d8-8182-431a-94c5-a27d77b773e1-kube-api-access-qk6bk\") pod \"ingress-canary-dtz7g\" (UID: \"6d40e8d8-8182-431a-94c5-a27d77b773e1\") " pod="openshift-ingress-canary/ingress-canary-dtz7g" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.887768 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.890341 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.895722 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.903780 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.904225 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctrj4\" (UniqueName: \"kubernetes.io/projected/56c261dd-49c4-4f69-9400-7a012e281b7b-kube-api-access-ctrj4\") pod \"route-controller-manager-6576b87f9c-qr5sm\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.917233 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.917595 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.917672 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:58:56 crc kubenswrapper[4802]: E1201 19:58:56.918225 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.418206863 +0000 UTC m=+158.980766504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.928637 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8mbf\" (UniqueName: \"kubernetes.io/projected/4bdd38ce-c886-49bc-887c-b2936750731c-kube-api-access-k8mbf\") pod \"catalog-operator-68c6474976-cvht4\" (UID: \"4bdd38ce-c886-49bc-887c-b2936750731c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.931938 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.937861 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsvqt\" (UniqueName: \"kubernetes.io/projected/8b7d6d96-d0c1-4280-9f00-041935714120-kube-api-access-gsvqt\") pod \"machine-config-server-t4kb6\" (UID: \"8b7d6d96-d0c1-4280-9f00-041935714120\") " pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.940507 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g7fvq" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.941395 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mn4k7"] Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.948667 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dtz7g" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.960557 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-t4kb6" Dec 01 19:58:56 crc kubenswrapper[4802]: I1201 19:58:56.976492 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.003426 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.005396 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv"] Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.020479 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.020984 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.520973995 +0000 UTC m=+159.083533636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.058829 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-s8fb5"] Dec 01 19:58:57 crc kubenswrapper[4802]: W1201 19:58:57.098067 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad1c23af_b7e5_4d78_b9c0_272b55237564.slice/crio-186d7fa25a14ed5d0931c865486b47cd86e5da66347e73001e80c892a0be3cde WatchSource:0}: Error finding container 186d7fa25a14ed5d0931c865486b47cd86e5da66347e73001e80c892a0be3cde: Status 404 returned error can't find the container with id 186d7fa25a14ed5d0931c865486b47cd86e5da66347e73001e80c892a0be3cde Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.122006 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.122159 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.622141507 +0000 UTC m=+159.184701138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.122290 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.122547 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.62254001 +0000 UTC m=+159.185099651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.126653 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-4czjq"] Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.135527 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm"] Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.225684 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.226153 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.726136027 +0000 UTC m=+159.288695668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.226293 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.303639 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj"] Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.327267 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.327771 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.827760253 +0000 UTC m=+159.390319894 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.382359 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n"] Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.432090 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.432754 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:57.932738884 +0000 UTC m=+159.495298525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.514864 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh"] Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.516659 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r"] Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.534033 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.534355 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:58.034343671 +0000 UTC m=+159.596903312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.540032 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s"] Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.558485 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qfg52" event={"ID":"417dfff6-8c73-474a-8ec2-aadab4e32131","Type":"ContainerStarted","Data":"b8bdedf0cc92ad0985ed438aa2c9c5c163b5120d0f878ab957c9749e58f390a3"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.558525 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qfg52" event={"ID":"417dfff6-8c73-474a-8ec2-aadab4e32131","Type":"ContainerStarted","Data":"e59556c71f41d0dce94c0ae8ede97ed12402a3e525ae30da1f02543450f38405"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.559246 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.561625 4802 patch_prober.go:28] interesting pod/console-operator-58897d9998-qfg52 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.561687 4802 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qfg52" podUID="417dfff6-8c73-474a-8ec2-aadab4e32131" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.562873 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" event={"ID":"3dac3049-5e67-48ba-8584-be24cbcfdd36","Type":"ContainerStarted","Data":"89a492d3f05f2c8b67f26d08336e46a559ed5a9560695134aed86cebf02c736c"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.588004 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" event={"ID":"081dccc6-dbee-40a9-8333-d1c178e3fab3","Type":"ContainerStarted","Data":"b3fec8a010de90205c8ecf7c6f5117e54dc9e532318635280854a01617672ba2"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.609417 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" event={"ID":"ad1c23af-b7e5-4d78-b9c0-272b55237564","Type":"ContainerStarted","Data":"186d7fa25a14ed5d0931c865486b47cd86e5da66347e73001e80c892a0be3cde"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.621285 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" event={"ID":"4dc8de31-caf7-493a-a966-64105cf2e8fc","Type":"ContainerStarted","Data":"6e7a6016959126e8891df8ec724ddefbaaa5c52ba30e8a5122d0ad9606016731"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.632077 4802 generic.go:334] "Generic (PLEG): container finished" podID="099f3158-3fcc-4d67-9ee2-1a9229e1ad23" containerID="7a728c73c70dd29dad9f4a612ce7da8fe316d4546df994cc4abf107f859d65ac" exitCode=0 Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.632315 
4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" event={"ID":"099f3158-3fcc-4d67-9ee2-1a9229e1ad23","Type":"ContainerDied","Data":"7a728c73c70dd29dad9f4a612ce7da8fe316d4546df994cc4abf107f859d65ac"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.635003 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.635285 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:58.135271686 +0000 UTC m=+159.697831327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.635942 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-62kxj" event={"ID":"9f3283e5-38b2-4f3e-a0d4-122b734e79d4","Type":"ContainerStarted","Data":"b0db5e51eb0fcea0710fd46dd7bfd5394fec1eb11e12a9164997990571929467"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.635981 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-62kxj" event={"ID":"9f3283e5-38b2-4f3e-a0d4-122b734e79d4","Type":"ContainerStarted","Data":"0fbb1831ea417e8b84a035f0d6c021dc5b05cad0e34350f16864f4d0cf9eb227"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.645634 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" event={"ID":"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2","Type":"ContainerStarted","Data":"7023b2a035d254f446bcc5fc798a430428c27a2e61efca45ebf56caa8122f52b"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.645671 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" event={"ID":"21ad3988-7e9c-4d83-9fad-1ecebbd7c9f2","Type":"ContainerStarted","Data":"7cf3193ae3aed2bf55fdaa8e408b4cd24bc4e2d911409fe88275dd1759d6202f"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.655157 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" 
event={"ID":"6b139dad-bbb0-4d0f-bd11-14f142ef1767","Type":"ContainerStarted","Data":"02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.655233 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" event={"ID":"6b139dad-bbb0-4d0f-bd11-14f142ef1767","Type":"ContainerStarted","Data":"f1d4bb3293dee0da48beccae98e4ccad27878e2d9e5ca330618b95a1cab82856"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.657389 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.682998 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.683963 4802 generic.go:334] "Generic (PLEG): container finished" podID="5a4d8cc9-1731-4087-8e6f-ac0e184616fe" containerID="41f1b767bdcf1cf59c90897484fdea7931bf5339553b9e01406154f4bbbc2ca1" exitCode=0 Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.684039 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" event={"ID":"5a4d8cc9-1731-4087-8e6f-ac0e184616fe","Type":"ContainerDied","Data":"41f1b767bdcf1cf59c90897484fdea7931bf5339553b9e01406154f4bbbc2ca1"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.695415 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" event={"ID":"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f","Type":"ContainerStarted","Data":"1a6add2e40ea249b3e0333b69b4845cec6f43c52b43aaa4c5d8a86156dc8b177"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.710170 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-g42nq" event={"ID":"46b256be-246d-43ad-99b9-7b67eca6762a","Type":"ContainerStarted","Data":"fcf70435585c56b5c0740ff6817d3cb2c1460c11e06e888917692b3f53ffefec"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.710216 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g42nq" event={"ID":"46b256be-246d-43ad-99b9-7b67eca6762a","Type":"ContainerStarted","Data":"5478a3e5484fe80b386bacd83f9643c8234eec27c4d7ecdbbc40dff7a9d79df0"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.727731 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lvx7b" event={"ID":"7e98194d-958a-4c56-b5a3-90e01eab1816","Type":"ContainerStarted","Data":"3cae77215c85b596e22b22ac7fcd980e98fc57e05cac6d0f5f93c1c393aa3175"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.731812 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" event={"ID":"ed226f25-eec2-4393-961c-9c8d6011e8dc","Type":"ContainerStarted","Data":"2f59a6eb8c545a8867599f1f3fc5d81be92a86117cb06bf2a6351f5339934ff5"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.736532 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" event={"ID":"fe0e9324-8a2b-499e-9fde-72e7f3effad9","Type":"ContainerStarted","Data":"8b494c93af8f314039d6fa9a3ae74d1147d0e328407563e48b28e8cde7ae2890"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.736668 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.737464 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:58.237452999 +0000 UTC m=+159.800012640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.756041 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847" event={"ID":"5217fe99-8018-4db8-8f1d-2e0b33056638","Type":"ContainerStarted","Data":"91f97fdce4b4a085ce59127988d58398dbb87d6ce4f8efc3e3862897ea674497"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.793320 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7cxs2" podStartSLOduration=139.793302899 podStartE2EDuration="2m19.793302899s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:57.788418998 +0000 UTC m=+159.350978639" watchObservedRunningTime="2025-12-01 19:58:57.793302899 +0000 UTC m=+159.355862540" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.806911 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6" event={"ID":"ec8148e4-5519-4b98-a17f-b80f1a44a4da","Type":"ContainerStarted","Data":"c636deae59f3a1536fc8ba01e6d6df5712e1bde564c91747552833d1e17bb921"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.807269 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6" event={"ID":"ec8148e4-5519-4b98-a17f-b80f1a44a4da","Type":"ContainerStarted","Data":"da632ce3e7106c7452c05779ed42514c7d11490e48997201abb36d7068fd3825"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.840072 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.842326 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:58.342305527 +0000 UTC m=+159.904865168 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.872213 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" event={"ID":"a7b1b1b8-14c6-4649-b791-1de21278aa35","Type":"ContainerStarted","Data":"daeca7f0c087aa519f26cdd7d94c8a81b7e84457e418dd8c6240250c6b62243c"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.890682 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-62kxj" podStartSLOduration=139.890665803 podStartE2EDuration="2m19.890665803s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:57.889960712 +0000 UTC m=+159.452520353" watchObservedRunningTime="2025-12-01 19:58:57.890665803 +0000 UTC m=+159.453225434" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.892129 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gjlgw" event={"ID":"6af22b89-378d-4aab-a028-2e19ec6e8d1c","Type":"ContainerStarted","Data":"5103160dfe680cb240c938526a07879cffddfa44ca5038f2253dac7a458b058b"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.892167 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gjlgw" 
event={"ID":"6af22b89-378d-4aab-a028-2e19ec6e8d1c","Type":"ContainerStarted","Data":"6ab21c489b622a2d0e87f6e0c85d6b1d269057f18eb11cd682d4f9da32c8c770"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.892845 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gjlgw" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.897535 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4" event={"ID":"1cb3789f-6490-4c1c-a5a8-54f3d9ea3bb0","Type":"ContainerStarted","Data":"3082d14e8fbbca38c0bb9359d0dc711d0d482f81e9fc154f948e037906ebf1bc"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.897573 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4" event={"ID":"1cb3789f-6490-4c1c-a5a8-54f3d9ea3bb0","Type":"ContainerStarted","Data":"ca931cd35e54699cce25254c173bc2327a2bfa63f0d88fdb47b1148c841b9c76"} Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.927443 4802 patch_prober.go:28] interesting pod/downloads-7954f5f757-gjlgw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.927664 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gjlgw" podUID="6af22b89-378d-4aab-a028-2e19ec6e8d1c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.943155 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:57 crc kubenswrapper[4802]: E1201 19:58:57.945284 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:58.445269604 +0000 UTC m=+160.007829245 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:57 crc kubenswrapper[4802]: I1201 19:58:57.969922 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" event={"ID":"bda97cb1-ea4c-499a-8549-dc62ae7e08c6","Type":"ContainerStarted","Data":"0a3c6d0807ba07dd52d6912a32333ded291500fd230d544ae7f69bdb4dd843c0"} Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.021743 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" event={"ID":"fcf38656-7a15-4cd1-9038-83272327ce3c","Type":"ContainerStarted","Data":"009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72"} Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.021789 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" 
event={"ID":"fcf38656-7a15-4cd1-9038-83272327ce3c","Type":"ContainerStarted","Data":"33d7f8738f9f3ca5cd99f028801fa9a18310a1a2043316cf5360f850c2491ca9"} Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.022585 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.052890 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.053119 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb" event={"ID":"d8d9c52e-a041-4e4c-a364-ef09f105a206","Type":"ContainerStarted","Data":"a6877498d400b059edc61d319cc897e68955474ce4bd284e473ff5ead66603ce"} Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.053169 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb" event={"ID":"d8d9c52e-a041-4e4c-a364-ef09f105a206","Type":"ContainerStarted","Data":"ae9f3a0e06eb3cdd94493033ff74b873cb7e4260de4499db85b7f028a40738ea"} Dec 01 19:58:58 crc kubenswrapper[4802]: E1201 19:58:58.053793 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:58.553775524 +0000 UTC m=+160.116335165 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:58 crc kubenswrapper[4802]: W1201 19:58:58.075399 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79289702_5b61_4c95_9ed7_371d48b3cd4d.slice/crio-6a80df80e641001876b5e3b5631686c93c4f24146db0ea27f3968dc632d7226e WatchSource:0}: Error finding container 6a80df80e641001876b5e3b5631686c93c4f24146db0ea27f3968dc632d7226e: Status 404 returned error can't find the container with id 6a80df80e641001876b5e3b5631686c93c4f24146db0ea27f3968dc632d7226e Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.082425 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" event={"ID":"18c8b374-04ab-4b76-93d5-dd84b990a54e","Type":"ContainerStarted","Data":"16780f053d8c2f34bf10de0a5b806adbf915260f0e312bb91d726e6c5cdf47d6"} Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.102941 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.102995 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.128286 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.149618 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" event={"ID":"d2c2d1ec-c588-4247-aae2-c228404a38e0","Type":"ContainerStarted","Data":"a21f99daa5721c426e907538df6911d2ffa0a848bbdf33d85a481ba40ecf880b"} Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.149666 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" event={"ID":"d2c2d1ec-c588-4247-aae2-c228404a38e0","Type":"ContainerStarted","Data":"c6747fbbeff0177e5e480bde177f081fd9147d1c74f2019d456a55e8afcb75cc"} Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.159090 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:58 crc kubenswrapper[4802]: E1201 19:58:58.159685 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:58.659669432 +0000 UTC m=+160.222229073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.229621 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qfg52" podStartSLOduration=140.229605918 podStartE2EDuration="2m20.229605918s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:58.200689313 +0000 UTC m=+159.763248954" watchObservedRunningTime="2025-12-01 19:58:58.229605918 +0000 UTC m=+159.792165559" Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.259676 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:58 crc kubenswrapper[4802]: E1201 19:58:58.260039 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:58.760024259 +0000 UTC m=+160.322583900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.361388 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:58 crc kubenswrapper[4802]: E1201 19:58:58.362074 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:58.862055799 +0000 UTC m=+160.424615440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:58 crc kubenswrapper[4802]: W1201 19:58:58.447795 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0953c320_4dd8_4914_a84d_01bf5e9f11aa.slice/crio-97730183cfc38a36abed6c76a3286d5ce203c7c2e8ff0c8fed8401c1a780ebfc WatchSource:0}: Error finding container 97730183cfc38a36abed6c76a3286d5ce203c7c2e8ff0c8fed8401c1a780ebfc: Status 404 returned error can't find the container with id 97730183cfc38a36abed6c76a3286d5ce203c7c2e8ff0c8fed8401c1a780ebfc Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.449037 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pb67"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.492083 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.499856 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:58 crc kubenswrapper[4802]: E1201 19:58:58.500226 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.000210037 +0000 UTC m=+160.562769678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.584945 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6rtx7"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.602029 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:58 crc kubenswrapper[4802]: E1201 19:58:58.602366 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.10235526 +0000 UTC m=+160.664914901 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.608025 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.614682 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hqg8s"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.704956 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:58 crc kubenswrapper[4802]: E1201 19:58:58.705359 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.205343558 +0000 UTC m=+160.767903199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.752613 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" podStartSLOduration=140.752591911 podStartE2EDuration="2m20.752591911s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:58.731684354 +0000 UTC m=+160.294243995" watchObservedRunningTime="2025-12-01 19:58:58.752591911 +0000 UTC m=+160.315151552" Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.756430 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.769148 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g7fvq"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.806234 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:58 crc kubenswrapper[4802]: E1201 19:58:58.806527 4802 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.306516201 +0000 UTC m=+160.869075832 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:58 crc kubenswrapper[4802]: W1201 19:58:58.816705 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda686218d_cec1_409d_a394_a563b438fbf0.slice/crio-abe978237e35541c2c953bc9d0b1ed351f6e9b2e4ce3c7521648bdd1e763a73e WatchSource:0}: Error finding container abe978237e35541c2c953bc9d0b1ed351f6e9b2e4ce3c7521648bdd1e763a73e: Status 404 returned error can't find the container with id abe978237e35541c2c953bc9d0b1ed351f6e9b2e4ce3c7521648bdd1e763a73e Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.839067 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.840047 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dtz7g"] Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.855402 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6" podStartSLOduration=141.855381455 podStartE2EDuration="2m21.855381455s" podCreationTimestamp="2025-12-01 19:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:58.838785671 +0000 UTC m=+160.401345312" watchObservedRunningTime="2025-12-01 19:58:58.855381455 +0000 UTC m=+160.417941096" Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.861616 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4"] Dec 01 19:58:58 crc kubenswrapper[4802]: W1201 19:58:58.908903 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff03dfdf_cc89_44be_80ec_33df2a4c006b.slice/crio-3e64a26982564ed747595e981c4278b8e0a61d6ad5d186af256c5b060d8a0dc3 WatchSource:0}: Error finding container 3e64a26982564ed747595e981c4278b8e0a61d6ad5d186af256c5b060d8a0dc3: Status 404 returned error can't find the container with id 3e64a26982564ed747595e981c4278b8e0a61d6ad5d186af256c5b060d8a0dc3 Dec 01 19:58:58 crc kubenswrapper[4802]: I1201 19:58:58.909668 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:58.929285 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9"] Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:58.931503 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.431479611 +0000 UTC m=+160.994039252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.023459 4802 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-nsqcw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.023845 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" podUID="fcf38656-7a15-4cd1-9038-83272327ce3c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.035249 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.035570 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 19:58:59.535557663 +0000 UTC m=+161.098117304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.115446 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x847" podStartSLOduration=141.115430146 podStartE2EDuration="2m21.115430146s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.061036812 +0000 UTC m=+160.623596453" watchObservedRunningTime="2025-12-01 19:58:59.115430146 +0000 UTC m=+160.677989787" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.139135 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.139770 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.639739939 +0000 UTC m=+161.202299580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.225402 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-287bb" podStartSLOduration=140.225383121 podStartE2EDuration="2m20.225383121s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.219777497 +0000 UTC m=+160.782337138" watchObservedRunningTime="2025-12-01 19:58:59.225383121 +0000 UTC m=+160.787942762" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.241372 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.241778 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.741758298 +0000 UTC m=+161.304317939 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.242485 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" event={"ID":"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f","Type":"ContainerStarted","Data":"561d0ba4624130fc28c6625bcea4b99d84dab94964069047d161c7df76e344cb"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.306324 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lvx7b" event={"ID":"7e98194d-958a-4c56-b5a3-90e01eab1816","Type":"ContainerStarted","Data":"d5d9913e09167f36f60f8d97a1f136f9f230925da2f8f91cdb5e091e0131571f"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.312781 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" podStartSLOduration=141.312748415 podStartE2EDuration="2m21.312748415s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.306219504 +0000 UTC m=+160.868779145" watchObservedRunningTime="2025-12-01 19:58:59.312748415 +0000 UTC m=+160.875308056" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.323572 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" 
event={"ID":"ed226f25-eec2-4393-961c-9c8d6011e8dc","Type":"ContainerStarted","Data":"24a103791b3a310a799e38c685fb048a32e8d60783747dd51299344410c1b69f"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.343977 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.344312 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.844284032 +0000 UTC m=+161.406843673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.344678 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.346807 4802 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.846584163 +0000 UTC m=+161.409143804 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.355689 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" event={"ID":"79289702-5b61-4c95-9ed7-371d48b3cd4d","Type":"ContainerStarted","Data":"3d425a075e63b8646b2f44c486cde8f69120a89ed974a4b8c5de0965a1610a2f"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.355752 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" event={"ID":"79289702-5b61-4c95-9ed7-371d48b3cd4d","Type":"ContainerStarted","Data":"6a80df80e641001876b5e3b5631686c93c4f24146db0ea27f3968dc632d7226e"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.356479 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.391224 4802 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sdh6r container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 
19:58:59.391284 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" podUID="79289702-5b61-4c95-9ed7-371d48b3cd4d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.391668 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" event={"ID":"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c","Type":"ContainerStarted","Data":"34ce64565b5d6c7fa5c9e3026dfab9196b6ae9cff6af4656046bb2ea0080253e"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.391732 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" event={"ID":"b6da7bbe-5c7b-42a0-8d53-f726bf6c3d9c","Type":"ContainerStarted","Data":"5311a50d043b7f90b91be18a7c562a5ace68e99633444d4e883cb7a3123a2ec1"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.455235 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.460404 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g7fvq" event={"ID":"ff03dfdf-cc89-44be-80ec-33df2a4c006b","Type":"ContainerStarted","Data":"3e64a26982564ed747595e981c4278b8e0a61d6ad5d186af256c5b060d8a0dc3"} Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.461734 4802 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:58:59.961703888 +0000 UTC m=+161.524263529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.472270 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gjlgw" podStartSLOduration=141.472186572 podStartE2EDuration="2m21.472186572s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.444483194 +0000 UTC m=+161.007042835" watchObservedRunningTime="2025-12-01 19:58:59.472186572 +0000 UTC m=+161.034746213" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.472960 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pl7mh" podStartSLOduration=141.472951146 podStartE2EDuration="2m21.472951146s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.471147171 +0000 UTC m=+161.033706812" watchObservedRunningTime="2025-12-01 19:58:59.472951146 +0000 UTC m=+161.035510787" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 
19:58:59.503831 4802 generic.go:334] "Generic (PLEG): container finished" podID="18c8b374-04ab-4b76-93d5-dd84b990a54e" containerID="eed3df65c6a257e1c360087c655cc14919d6319a75257853701ba2f0245f4aae" exitCode=0 Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.503972 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" event={"ID":"18c8b374-04ab-4b76-93d5-dd84b990a54e","Type":"ContainerDied","Data":"eed3df65c6a257e1c360087c655cc14919d6319a75257853701ba2f0245f4aae"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.512128 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" event={"ID":"a7b1b1b8-14c6-4649-b791-1de21278aa35","Type":"ContainerStarted","Data":"0c171e085b85031983142f7a72f811ca507a7703121dc0affe35178abbcfa922"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.518035 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" event={"ID":"2a619835-3b06-4cd8-8a68-87c6a1a997c5","Type":"ContainerStarted","Data":"c422fe6b684c006c2d4867c1d6d67fab3264db182e67d582e2cba2146a7a1649"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.525982 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" event={"ID":"bda97cb1-ea4c-499a-8549-dc62ae7e08c6","Type":"ContainerStarted","Data":"ef297f930a831e4c2818c4163b4deb2149f8b1ffb9068a46e5828ef8b7f29e8a"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.539713 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s8dcm" podStartSLOduration=141.539688992 podStartE2EDuration="2m21.539688992s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.490040585 +0000 UTC m=+161.052600226" watchObservedRunningTime="2025-12-01 19:58:59.539688992 +0000 UTC m=+161.102248633" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.546486 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4" event={"ID":"1cb3789f-6490-4c1c-a5a8-54f3d9ea3bb0","Type":"ContainerStarted","Data":"225f059fe886fdf9f1803fc11caac7d3dfe9f6b9102957ab31483d2b036eed53"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.563046 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.563506 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.06348842 +0000 UTC m=+161.626048061 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.587724 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-t4kb6" event={"ID":"8b7d6d96-d0c1-4280-9f00-041935714120","Type":"ContainerStarted","Data":"afc71d8b2844ee17807b5c8133ed54c6e63bd9ee7286e20e3cec605cf8c85b00"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.588141 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-t4kb6" event={"ID":"8b7d6d96-d0c1-4280-9f00-041935714120","Type":"ContainerStarted","Data":"08346d74df5ff480cdfb347f5ae6f8401da18db84d244f4f4121df438ff6ba41"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.600698 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" podStartSLOduration=140.600680422 podStartE2EDuration="2m20.600680422s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.530564341 +0000 UTC m=+161.093123992" watchObservedRunningTime="2025-12-01 19:58:59.600680422 +0000 UTC m=+161.163240063" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.601887 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-lvx7b" podStartSLOduration=141.601878838 podStartE2EDuration="2m21.601878838s" 
podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.599426752 +0000 UTC m=+161.161986393" watchObservedRunningTime="2025-12-01 19:58:59.601878838 +0000 UTC m=+161.164438489" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.607666 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" event={"ID":"43dc6cdf-8ced-494c-8237-75ff4d23caba","Type":"ContainerStarted","Data":"d03d07f18665095b36c0bf32730a1ac075578dd1245fc610b75cd1611b6335b5"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.659137 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" event={"ID":"d2c2d1ec-c588-4247-aae2-c228404a38e0","Type":"ContainerStarted","Data":"934e1fb2bd7122f7bbea7bccc643d14db9eaeab354868a9a10d38d85f3cbe0d2"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.665833 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.667669 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.167653875 +0000 UTC m=+161.730213516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.705463 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" event={"ID":"a686218d-cec1-409d-a394-a563b438fbf0","Type":"ContainerStarted","Data":"abe978237e35541c2c953bc9d0b1ed351f6e9b2e4ce3c7521648bdd1e763a73e"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.732387 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" event={"ID":"0953c320-4dd8-4914-a84d-01bf5e9f11aa","Type":"ContainerStarted","Data":"97730183cfc38a36abed6c76a3286d5ce203c7c2e8ff0c8fed8401c1a780ebfc"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.765840 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jnwr8" podStartSLOduration=141.765811225 podStartE2EDuration="2m21.765811225s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.762812072 +0000 UTC m=+161.325371713" watchObservedRunningTime="2025-12-01 19:58:59.765811225 +0000 UTC m=+161.328370866" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.786520 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.786896 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.286876346 +0000 UTC m=+161.849435987 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.787287 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-t4kb6" podStartSLOduration=6.787249228 podStartE2EDuration="6.787249228s" podCreationTimestamp="2025-12-01 19:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.684886228 +0000 UTC m=+161.247445869" watchObservedRunningTime="2025-12-01 19:58:59.787249228 +0000 UTC m=+161.349808869" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.810496 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.836077 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:58:59 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:58:59 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:58:59 crc kubenswrapper[4802]: healthz check failed Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.836207 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.847812 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" event={"ID":"ad1c23af-b7e5-4d78-b9c0-272b55237564","Type":"ContainerStarted","Data":"7059020ab5a1b824d596a1cbe4f675d4d70b641e866f48c99037121eb0096c98"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.884420 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-c6jk6" event={"ID":"ec8148e4-5519-4b98-a17f-b80f1a44a4da","Type":"ContainerStarted","Data":"a4d3cdefc542e31ae091d39ca74d8bdae10b2158c49a025a4c62784a77e13f7f"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.890715 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:58:59 crc kubenswrapper[4802]: E1201 19:58:59.891142 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.391125764 +0000 UTC m=+161.953685405 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.892808 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nqwc4" podStartSLOduration=141.892799506 podStartE2EDuration="2m21.892799506s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.889341638 +0000 UTC m=+161.451901279" watchObservedRunningTime="2025-12-01 19:58:59.892799506 +0000 UTC m=+161.455359147" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.913599 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" event={"ID":"081dccc6-dbee-40a9-8333-d1c178e3fab3","Type":"ContainerStarted","Data":"14ba7088961527c5c9887e5909d2898863b4b063e7bb01065f35614ce104b714"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.932635 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" event={"ID":"4c9f8f21-904c-465c-8acb-5e552719a02f","Type":"ContainerStarted","Data":"59aa53ae9df9bc3b5c09ea1779a7eab5ef04476a75fdeaf926b6932d16958127"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.937567 4802 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx65z" podStartSLOduration=140.937549741 podStartE2EDuration="2m20.937549741s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.937057476 +0000 UTC m=+161.499617107" watchObservedRunningTime="2025-12-01 19:58:59.937549741 +0000 UTC m=+161.500109382" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.938869 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" event={"ID":"4dc8de31-caf7-493a-a966-64105cf2e8fc","Type":"ContainerStarted","Data":"9f5b2976a266a38fe17c298bc53c5910a60047ee5ae2b80fbfef1e3972925b0f"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.970667 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" event={"ID":"5a4d8cc9-1731-4087-8e6f-ac0e184616fe","Type":"ContainerStarted","Data":"b96d8542c6d213a44df1825c159d7e8215d9113200c64a4fe82b288781d9fe91"} Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.971453 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" Dec 01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.984776 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" podStartSLOduration=141.984759053 podStartE2EDuration="2m21.984759053s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:58:59.982818843 +0000 UTC m=+161.545378474" watchObservedRunningTime="2025-12-01 19:58:59.984759053 +0000 UTC m=+161.547318694" Dec 
01 19:58:59 crc kubenswrapper[4802]: I1201 19:58:59.993168 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:58:59.997482 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dtz7g" event={"ID":"6d40e8d8-8182-431a-94c5-a27d77b773e1","Type":"ContainerStarted","Data":"5ab329af6a6dc9a41ea3cc1db8bb4426a6d7f5f0a9be36cca5300138847361b4"} Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:58:59.998524 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.498509618 +0000 UTC m=+162.061069259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.011379 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" event={"ID":"3dac3049-5e67-48ba-8584-be24cbcfdd36","Type":"ContainerStarted","Data":"6f1325b4bd1951295e14ad22f0346cedbca632cb774c9641f22860d51e6000e6"} Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.037366 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" event={"ID":"4bdd38ce-c886-49bc-887c-b2936750731c","Type":"ContainerStarted","Data":"880789daa0bafd66b3585e1e25a5cf7527d6781412e61d70e91d83e5804317a7"} Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.061604 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" event={"ID":"04f115d4-aeb2-4fa4-b561-095eff3a18ea","Type":"ContainerStarted","Data":"1c9ed557de7b8639cf8bcc53d4540cd867d94c06d467e40c206cc1b85c4828a2"} Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.061644 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" event={"ID":"04f115d4-aeb2-4fa4-b561-095eff3a18ea","Type":"ContainerStarted","Data":"a592362b8e70ac5c2b9cbd47bc67b7d2042f596e59fa187d29bc477c08ccbb5a"} Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.063052 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-s8fb5" podStartSLOduration=142.063034726 podStartE2EDuration="2m22.063034726s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:00.061333184 +0000 UTC m=+161.623892825" watchObservedRunningTime="2025-12-01 19:59:00.063034726 +0000 UTC m=+161.625594357" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.063476 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" podStartSLOduration=142.06346995 podStartE2EDuration="2m22.06346995s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:00.020929253 +0000 UTC m=+161.583488894" watchObservedRunningTime="2025-12-01 19:59:00.06346995 +0000 UTC m=+161.626029591" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.090641 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" event={"ID":"fe0e9324-8a2b-499e-9fde-72e7f3effad9","Type":"ContainerStarted","Data":"3d9279e427003b09dc205b5144a58a255cdcda47cb30becd11fae9912b48ef94"} Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.095489 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.096826 4802 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.596794742 +0000 UTC m=+162.159354383 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.099892 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-h44gr" podStartSLOduration=142.099866727 podStartE2EDuration="2m22.099866727s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:00.089685822 +0000 UTC m=+161.652245463" watchObservedRunningTime="2025-12-01 19:59:00.099866727 +0000 UTC m=+161.662426368" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.139448 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" event={"ID":"94fd9d5d-ddb0-4b19-83c1-d8df802aec70","Type":"ContainerStarted","Data":"a4f7e927cca535a9a5ae6ae3290c4199b620afd0f8fc500b86a2614efc698108"} Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.172277 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" 
event={"ID":"56c261dd-49c4-4f69-9400-7a012e281b7b","Type":"ContainerStarted","Data":"db33ca0ca1be15ce759f7d676072688a219b463d93e03cc53afd0fd950f62d32"} Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.197803 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.198108 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.698095599 +0000 UTC m=+162.260655230 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.216458 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" event={"ID":"a5bc347e-a789-4ce0-8800-2c322039d4a5","Type":"ContainerStarted","Data":"5bffc1719e00c912114e77088ff114ca119478c5d9cef5be4a891c43cbe0b0fb"} Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.217225 4802 patch_prober.go:28] interesting pod/downloads-7954f5f757-gjlgw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.217261 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gjlgw" podUID="6af22b89-378d-4aab-a028-2e19ec6e8d1c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.217746 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.223356 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.224993 4802 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-bwjbd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.225047 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" podUID="a5bc347e-a789-4ce0-8800-2c322039d4a5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.263866 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kbqcj" podStartSLOduration=142.263845425 podStartE2EDuration="2m22.263845425s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:00.127614246 +0000 UTC m=+161.690173887" watchObservedRunningTime="2025-12-01 19:59:00.263845425 +0000 UTC m=+161.826405066" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.273488 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qfg52" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.277428 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" podStartSLOduration=141.277405224 podStartE2EDuration="2m21.277405224s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:00.263625978 +0000 UTC m=+161.826185619" watchObservedRunningTime="2025-12-01 19:59:00.277405224 +0000 UTC m=+161.839964865" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.302083 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.304186 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.804171853 +0000 UTC m=+162.366731484 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.406088 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.443263 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:00.94323546 +0000 UTC m=+162.505795101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.507222 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.507853 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.007836749 +0000 UTC m=+162.570396390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.609615 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.609930 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.10991811 +0000 UTC m=+162.672477751 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.710349 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.710534 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.210509645 +0000 UTC m=+162.773069286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.710589 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.710913 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.210902217 +0000 UTC m=+162.773461848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.811402 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.811732 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.811784 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.31175794 +0000 UTC m=+162.874317581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.817422 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:00 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:00 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:00 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.817480 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.820689 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/008be62d-2cef-42a3-912f-2b2e58f8e30b-metrics-certs\") pod \"network-metrics-daemon-p8cs7\" (UID: \"008be62d-2cef-42a3-912f-2b2e58f8e30b\") " pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.842679 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8cs7" Dec 01 19:59:00 crc kubenswrapper[4802]: I1201 19:59:00.912904 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:00 crc kubenswrapper[4802]: E1201 19:59:00.913314 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.413294984 +0000 UTC m=+162.975854685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.014053 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.014150 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.514134267 +0000 UTC m=+163.076693908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.014324 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.014584 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.514576999 +0000 UTC m=+163.077136640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.115963 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.116270 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.616246218 +0000 UTC m=+163.178805859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.116644 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.116963 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.61695523 +0000 UTC m=+163.179514871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.218343 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.218857 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.718842495 +0000 UTC m=+163.281402136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.269743 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g42nq" event={"ID":"46b256be-246d-43ad-99b9-7b67eca6762a","Type":"ContainerStarted","Data":"790f8543bb405b447ecf9b9dcfe0dbd7eb253f5b2bf52247c74cb52ac121fe5a"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.287562 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" event={"ID":"921a6a6c-bdb5-4e35-8428-17fae5f50192","Type":"ContainerStarted","Data":"080ed7b22c519bc8bc3e2c75f62bc5be4241f878699adeadadcb1b7c03f8055d"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.287597 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" event={"ID":"921a6a6c-bdb5-4e35-8428-17fae5f50192","Type":"ContainerStarted","Data":"068569dd7ebfe439a18ebe2033a6d3be9b588b04110f60a557e79be3279c601d"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.304554 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" event={"ID":"a5bc347e-a789-4ce0-8800-2c322039d4a5","Type":"ContainerStarted","Data":"051bf20f567adde465b3892fbfe28909ae023090f7e3f7fe52299f5f1ad78340"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.313644 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" event={"ID":"2a619835-3b06-4cd8-8a68-87c6a1a997c5","Type":"ContainerStarted","Data":"376eec041725e2a8beb4f8a345dfeca81a06c59d039083910bca30e57c927988"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.313688 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" event={"ID":"2a619835-3b06-4cd8-8a68-87c6a1a997c5","Type":"ContainerStarted","Data":"ac330cdafc3241b3517a38179c95e403b4185ff93affc229b0d08ea9f3797b3c"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.314223 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.320958 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.321243 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.821231675 +0000 UTC m=+163.383791316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.341440 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" event={"ID":"ad1c23af-b7e5-4d78-b9c0-272b55237564","Type":"ContainerStarted","Data":"a681b5995ae0972bdd60b4916d7d8b1e2a474be38c9f58d4d3f12442eee9859f"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.363411 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g7fvq" event={"ID":"ff03dfdf-cc89-44be-80ec-33df2a4c006b","Type":"ContainerStarted","Data":"532853a23af2f5ec61e23d7b4b90d4c2800a49358bfdf883502e12cadca44bc8"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.363456 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g7fvq" event={"ID":"ff03dfdf-cc89-44be-80ec-33df2a4c006b","Type":"ContainerStarted","Data":"bf57be12aff642f6447dfbae13ccba73ef809d7917718f6f4920e0f414f5a86e"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.363870 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-g7fvq" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.377719 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" event={"ID":"94fd9d5d-ddb0-4b19-83c1-d8df802aec70","Type":"ContainerStarted","Data":"d46b7dd7dc8663f506e707dada8ee615fc70fd87ac9c7d3de30b56524c41e5a8"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.386079 4802 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-g42nq" podStartSLOduration=143.386062913 podStartE2EDuration="2m23.386062913s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.327619283 +0000 UTC m=+162.890178924" watchObservedRunningTime="2025-12-01 19:59:01.386062913 +0000 UTC m=+162.948622554" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.388133 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-6bdr9" podStartSLOduration=142.388123776 podStartE2EDuration="2m22.388123776s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.38469241 +0000 UTC m=+162.947252051" watchObservedRunningTime="2025-12-01 19:59:01.388123776 +0000 UTC m=+162.950683417" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.397623 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p8cs7"] Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.402754 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" event={"ID":"a686218d-cec1-409d-a394-a563b438fbf0","Type":"ContainerStarted","Data":"9f4d1e2cf251541824148e24c5cfdd0abf372c9029743356dffa2629fc838370"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.417425 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" event={"ID":"0953c320-4dd8-4914-a84d-01bf5e9f11aa","Type":"ContainerStarted","Data":"55d95a4424fd1ba662cdc14de1df516e788799a714524c93676d9f5d58cef4dd"} Dec 
01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.417664 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.421619 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.423264 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:01.923245855 +0000 UTC m=+163.485805486 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.427321 4802 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2pb67 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.427381 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" 
podUID="0953c320-4dd8-4914-a84d-01bf5e9f11aa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.16:8080/healthz\": dial tcp 10.217.0.16:8080: connect: connection refused" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.453563 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dtz7g" event={"ID":"6d40e8d8-8182-431a-94c5-a27d77b773e1","Type":"ContainerStarted","Data":"135d39ec4ba5b8f74cc2f45ea78b135cff8d0910ed55f6d3ea70fb459d9d9b73"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.489494 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" event={"ID":"4c9f8f21-904c-465c-8acb-5e552719a02f","Type":"ContainerStarted","Data":"255771d94a3b60ac959a292e4e087a134c26f0115c96c79da9aecedfaf7c48dc"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.520837 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" event={"ID":"099f3158-3fcc-4d67-9ee2-1a9229e1ad23","Type":"ContainerStarted","Data":"dd448dd1c271e3c387caed931ed257225503c33c950cf30787fe6caad748232b"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.523010 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.524155 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" event={"ID":"56c261dd-49c4-4f69-9400-7a012e281b7b","Type":"ContainerStarted","Data":"99717dba24628eb04984c7e6568fed381d6a91f494eb675aaef265a7e6e83292"} Dec 01 
19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.524814 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.525040 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:02.025028895 +0000 UTC m=+163.587588536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.537011 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" event={"ID":"a7b1b1b8-14c6-4649-b791-1de21278aa35","Type":"ContainerStarted","Data":"5b46146a66a3b23b5c30f17e96104b462909e20846d0c43e0fbcee703b12e7b6"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.549441 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" event={"ID":"4bdd38ce-c886-49bc-887c-b2936750731c","Type":"ContainerStarted","Data":"90accd61f92a670858a6f28b0efb373fd02dd5b0e916c1a54d7d3b6906d74600"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.549987 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.562758 4802 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" podStartSLOduration=142.562748974 podStartE2EDuration="2m22.562748974s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.546852281 +0000 UTC m=+163.109411922" watchObservedRunningTime="2025-12-01 19:59:01.562748974 +0000 UTC m=+163.125308615" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.563691 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" podStartSLOduration=142.563686262 podStartE2EDuration="2m22.563686262s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.45313153 +0000 UTC m=+163.015691171" watchObservedRunningTime="2025-12-01 19:59:01.563686262 +0000 UTC m=+163.126245903" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.564119 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.576735 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.591619 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" event={"ID":"04f115d4-aeb2-4fa4-b561-095eff3a18ea","Type":"ContainerStarted","Data":"b1d6748389f9dac3795e0ef1982dfb366cfa4262b16ce131875b1aa9530b3689"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.598426 4802 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" event={"ID":"efce51d1-6c15-4dde-a0e3-d86fc67c2f0f","Type":"ContainerStarted","Data":"ba106585967023ab7dcd56c480afd715b575970ae884f86600df49e60923ec2f"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.602549 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" event={"ID":"43dc6cdf-8ced-494c-8237-75ff4d23caba","Type":"ContainerStarted","Data":"26d29731517657e64d518f020509db050c084b72f340bde82b934ba56f80256e"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.602605 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" event={"ID":"43dc6cdf-8ced-494c-8237-75ff4d23caba","Type":"ContainerStarted","Data":"06f1e76e8b0b5e74bdee4182d3edbe34618ecba3bbf512d70ef995d216bde15e"} Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.606338 4802 patch_prober.go:28] interesting pod/downloads-7954f5f757-gjlgw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.606382 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gjlgw" podUID="6af22b89-378d-4aab-a028-2e19ec6e8d1c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.625003 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g7fvq" podStartSLOduration=8.62498915 podStartE2EDuration="8.62498915s" podCreationTimestamp="2025-12-01 19:58:53 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.624889327 +0000 UTC m=+163.187448968" watchObservedRunningTime="2025-12-01 19:59:01.62498915 +0000 UTC m=+163.187548791" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.625686 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.626826 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:02.126813277 +0000 UTC m=+163.689372918 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.628555 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r9w8p" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.640779 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sdh6r" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.665219 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5bdsk" podStartSLOduration=143.665206606 podStartE2EDuration="2m23.665206606s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.663989099 +0000 UTC m=+163.226548740" watchObservedRunningTime="2025-12-01 19:59:01.665206606 +0000 UTC m=+163.227766247" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.689818 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mn4k7" podStartSLOduration=142.689801587 podStartE2EDuration="2m22.689801587s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.68856444 +0000 UTC 
m=+163.251124081" watchObservedRunningTime="2025-12-01 19:59:01.689801587 +0000 UTC m=+163.252361228" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.713256 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" podStartSLOduration=142.713241593 podStartE2EDuration="2m22.713241593s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.711615553 +0000 UTC m=+163.274175194" watchObservedRunningTime="2025-12-01 19:59:01.713241593 +0000 UTC m=+163.275801234" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.727449 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.759288 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:02.259270478 +0000 UTC m=+163.821830119 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.843002 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.843403 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bwjbd" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.843985 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:01 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:01 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:01 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.844011 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.854403 4802 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:02.354358833 +0000 UTC m=+163.916918474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.870953 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjmdk" podStartSLOduration=143.870933887 podStartE2EDuration="2m23.870933887s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.868917223 +0000 UTC m=+163.431476864" watchObservedRunningTime="2025-12-01 19:59:01.870933887 +0000 UTC m=+163.433493528" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.889572 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" podStartSLOduration=142.889559533 podStartE2EDuration="2m22.889559533s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.889044277 +0000 UTC m=+163.451603928" watchObservedRunningTime="2025-12-01 19:59:01.889559533 +0000 UTC m=+163.452119174" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 
19:59:01.941997 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dtz7g" podStartSLOduration=8.941980186 podStartE2EDuration="8.941980186s" podCreationTimestamp="2025-12-01 19:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:01.935477665 +0000 UTC m=+163.498037306" watchObservedRunningTime="2025-12-01 19:59:01.941980186 +0000 UTC m=+163.504539827" Dec 01 19:59:01 crc kubenswrapper[4802]: I1201 19:59:01.969414 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:01 crc kubenswrapper[4802]: E1201 19:59:01.969739 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:02.469726985 +0000 UTC m=+164.032286626 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.026262 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6rtx7" podStartSLOduration=143.026241725 podStartE2EDuration="2m23.026241725s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:02.013564463 +0000 UTC m=+163.576124104" watchObservedRunningTime="2025-12-01 19:59:02.026241725 +0000 UTC m=+163.588801366" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.045305 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cvht4" podStartSLOduration=143.045290385 podStartE2EDuration="2m23.045290385s" podCreationTimestamp="2025-12-01 19:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:02.036532123 +0000 UTC m=+163.599091764" watchObservedRunningTime="2025-12-01 19:59:02.045290385 +0000 UTC m=+163.607850026" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.070733 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:02 crc kubenswrapper[4802]: E1201 19:59:02.070899 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:02.570875517 +0000 UTC m=+164.133435158 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.070974 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:02 crc kubenswrapper[4802]: E1201 19:59:02.071372 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:02.571359352 +0000 UTC m=+164.133918993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.073155 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w2f2s" podStartSLOduration=144.073143797 podStartE2EDuration="2m24.073143797s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:02.070459664 +0000 UTC m=+163.633019305" watchObservedRunningTime="2025-12-01 19:59:02.073143797 +0000 UTC m=+163.635703438" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.102133 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cdzf5" podStartSLOduration=145.102114324 podStartE2EDuration="2m25.102114324s" podCreationTimestamp="2025-12-01 19:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:02.097303256 +0000 UTC m=+163.659862917" watchObservedRunningTime="2025-12-01 19:59:02.102114324 +0000 UTC m=+163.664673955" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.131974 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dn99n" podStartSLOduration=144.131956968 podStartE2EDuration="2m24.131956968s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:02.130018278 +0000 UTC m=+163.692577919" watchObservedRunningTime="2025-12-01 19:59:02.131956968 +0000 UTC m=+163.694516609" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.171599 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:02 crc kubenswrapper[4802]: E1201 19:59:02.172255 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 19:59:02.672240926 +0000 UTC m=+164.234800567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.176611 4802 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.258502 4802 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T19:59:02.176629341Z","Handler":null,"Name":""} Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.273920 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:02 crc kubenswrapper[4802]: E1201 19:59:02.274445 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 19:59:02.774407989 +0000 UTC m=+164.336967710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zqg5r" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.280726 4802 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.280777 4802 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.374941 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.380015 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.476249 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.481254 4802 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.481292 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.615755 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" event={"ID":"18c8b374-04ab-4b76-93d5-dd84b990a54e","Type":"ContainerStarted","Data":"0f32e5b22ba359ea2afa6c41b7764c4effbdb4ccc9ef4a51ff03792fc78720e0"} Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.618036 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" event={"ID":"008be62d-2cef-42a3-912f-2b2e58f8e30b","Type":"ContainerStarted","Data":"74146ec2c11a1f955e32c63be37f8b4090606223060438e565946445fdd20bf8"} Dec 01 19:59:02 crc 
kubenswrapper[4802]: I1201 19:59:02.618093 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" event={"ID":"008be62d-2cef-42a3-912f-2b2e58f8e30b","Type":"ContainerStarted","Data":"fd30e6f652de5d40151e9a70a56014910c8806edb2d0d678c6f516986f37cbfe"} Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.620342 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" event={"ID":"94fd9d5d-ddb0-4b19-83c1-d8df802aec70","Type":"ContainerStarted","Data":"7d3cd8d020d7f84051c47bbe0b339399b73ee18305ea240c7338ed98a9227a18"} Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.623763 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.702934 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dtjsm"] Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.705497 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.708358 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.711782 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dtjsm"] Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.733682 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.780064 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zqg5r\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.780741 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-utilities\") pod \"community-operators-dtjsm\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.780790 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c49j2\" (UniqueName: \"kubernetes.io/projected/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-kube-api-access-c49j2\") pod \"community-operators-dtjsm\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc 
kubenswrapper[4802]: I1201 19:59:02.780852 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-catalog-content\") pod \"community-operators-dtjsm\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.811211 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:02 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:02 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:02 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.811401 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.859878 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.882109 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-utilities\") pod \"community-operators-dtjsm\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.882193 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c49j2\" (UniqueName: \"kubernetes.io/projected/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-kube-api-access-c49j2\") pod \"community-operators-dtjsm\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.882260 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-catalog-content\") pod \"community-operators-dtjsm\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.883018 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-catalog-content\") pod \"community-operators-dtjsm\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.883257 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-utilities\") pod \"community-operators-dtjsm\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " 
pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.909551 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bph4q"] Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.913448 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.914165 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c49j2\" (UniqueName: \"kubernetes.io/projected/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-kube-api-access-c49j2\") pod \"community-operators-dtjsm\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.920544 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.932388 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bph4q"] Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.983752 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58m2r\" (UniqueName: \"kubernetes.io/projected/1ef3009d-6227-4034-8325-544c3386a9fd-kube-api-access-58m2r\") pod \"certified-operators-bph4q\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.984119 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-utilities\") pod \"certified-operators-bph4q\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " 
pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:02 crc kubenswrapper[4802]: I1201 19:59:02.984192 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-catalog-content\") pod \"certified-operators-bph4q\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.024694 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dtjsm" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.085452 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-utilities\") pod \"certified-operators-bph4q\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.085530 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-catalog-content\") pod \"certified-operators-bph4q\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.085556 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58m2r\" (UniqueName: \"kubernetes.io/projected/1ef3009d-6227-4034-8325-544c3386a9fd-kube-api-access-58m2r\") pod \"certified-operators-bph4q\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.086329 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-utilities\") pod \"certified-operators-bph4q\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.086544 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-catalog-content\") pod \"certified-operators-bph4q\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.097169 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qvm7z"] Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.098275 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.105602 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58m2r\" (UniqueName: \"kubernetes.io/projected/1ef3009d-6227-4034-8325-544c3386a9fd-kube-api-access-58m2r\") pod \"certified-operators-bph4q\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.112106 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qvm7z"] Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.185673 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zqg5r"] Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.186224 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-utilities\") pod \"community-operators-qvm7z\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.186269 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-catalog-content\") pod \"community-operators-qvm7z\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.186347 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh59m\" (UniqueName: \"kubernetes.io/projected/225bc123-af5b-421f-b8e4-a3a13ce3ee58-kube-api-access-gh59m\") pod \"community-operators-qvm7z\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: W1201 19:59:03.205361 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59b4a8c_e6d3_4b2b_900f_0098c1f863f3.slice/crio-7d01374a6bed982cef7a09f5c25dcc57dfb8cbd9e86414dbda92b02a1219e31a WatchSource:0}: Error finding container 7d01374a6bed982cef7a09f5c25dcc57dfb8cbd9e86414dbda92b02a1219e31a: Status 404 returned error can't find the container with id 7d01374a6bed982cef7a09f5c25dcc57dfb8cbd9e86414dbda92b02a1219e31a Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.242685 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bph4q" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.287536 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh59m\" (UniqueName: \"kubernetes.io/projected/225bc123-af5b-421f-b8e4-a3a13ce3ee58-kube-api-access-gh59m\") pod \"community-operators-qvm7z\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.287604 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-utilities\") pod \"community-operators-qvm7z\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.287629 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-catalog-content\") pod \"community-operators-qvm7z\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.288086 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-catalog-content\") pod \"community-operators-qvm7z\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.288587 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-utilities\") pod \"community-operators-qvm7z\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " 
pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.291247 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-54h5m"] Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.292221 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.303037 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54h5m"] Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.305399 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh59m\" (UniqueName: \"kubernetes.io/projected/225bc123-af5b-421f-b8e4-a3a13ce3ee58-kube-api-access-gh59m\") pod \"community-operators-qvm7z\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.321079 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dtjsm"] Dec 01 19:59:03 crc kubenswrapper[4802]: W1201 19:59:03.335956 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f0c2e2b_e4b4_4733_831f_1a4f3e90b57b.slice/crio-b3cdd24c2dbdc1087a9199b1955ab96f8dff0157c6862a25ddec8109311bd4cf WatchSource:0}: Error finding container b3cdd24c2dbdc1087a9199b1955ab96f8dff0157c6862a25ddec8109311bd4cf: Status 404 returned error can't find the container with id b3cdd24c2dbdc1087a9199b1955ab96f8dff0157c6862a25ddec8109311bd4cf Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.389825 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-utilities\") pod 
\"certified-operators-54h5m\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.390479 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n56l\" (UniqueName: \"kubernetes.io/projected/edeba60a-7f88-48b8-a016-324a7527a666-kube-api-access-9n56l\") pod \"certified-operators-54h5m\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.390521 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-catalog-content\") pod \"certified-operators-54h5m\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.421175 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qvm7z" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.489601 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bph4q"] Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.491851 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n56l\" (UniqueName: \"kubernetes.io/projected/edeba60a-7f88-48b8-a016-324a7527a666-kube-api-access-9n56l\") pod \"certified-operators-54h5m\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.491908 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-catalog-content\") pod \"certified-operators-54h5m\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.491941 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-utilities\") pod \"certified-operators-54h5m\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.492449 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-utilities\") pod \"certified-operators-54h5m\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.492695 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-catalog-content\") pod \"certified-operators-54h5m\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.513009 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n56l\" (UniqueName: \"kubernetes.io/projected/edeba60a-7f88-48b8-a016-324a7527a666-kube-api-access-9n56l\") pod \"certified-operators-54h5m\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.611525 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54h5m" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.697778 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qvm7z"] Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.698418 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p8cs7" event={"ID":"008be62d-2cef-42a3-912f-2b2e58f8e30b","Type":"ContainerStarted","Data":"f414fdd9c149ebe05a4676aaa10d8d1c8a9817ac0f97a55704d577df5f8c2d2c"} Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.733574 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bph4q" event={"ID":"1ef3009d-6227-4034-8325-544c3386a9fd","Type":"ContainerStarted","Data":"f30cd507fc830616a5d986483428a388b3ef834d9469422fdb786ec925f1ef3d"} Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.749630 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-p8cs7" podStartSLOduration=145.749610438 podStartE2EDuration="2m25.749610438s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:03.741583118 +0000 UTC m=+165.304142759" watchObservedRunningTime="2025-12-01 19:59:03.749610438 +0000 UTC m=+165.312170079" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.752507 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" event={"ID":"94fd9d5d-ddb0-4b19-83c1-d8df802aec70","Type":"ContainerStarted","Data":"976c05abd9011e0274f3ff721d6a5aee64560d0bbf4e2b5cba9b8731b91e824f"} Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.752562 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" event={"ID":"94fd9d5d-ddb0-4b19-83c1-d8df802aec70","Type":"ContainerStarted","Data":"a824e8883e503c8e10bac72dcfd80a3584ce331e5ea80962b531c80b804e68a9"} Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.755020 4802 generic.go:334] "Generic (PLEG): container finished" podID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerID="a30774b380c596a09ae08f4c36a64624a4a1af505e3cc5981980a43710a96c2c" exitCode=0 Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.755079 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtjsm" event={"ID":"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b","Type":"ContainerDied","Data":"a30774b380c596a09ae08f4c36a64624a4a1af505e3cc5981980a43710a96c2c"} Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.755122 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtjsm" event={"ID":"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b","Type":"ContainerStarted","Data":"b3cdd24c2dbdc1087a9199b1955ab96f8dff0157c6862a25ddec8109311bd4cf"} Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.764631 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.781590 4802 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" event={"ID":"18c8b374-04ab-4b76-93d5-dd84b990a54e","Type":"ContainerStarted","Data":"60b5dc55c78db075ca69e0878925a3441fc24abe3f3c286cb37ea78c58884b59"} Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.796300 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hqg8s" podStartSLOduration=10.796281203 podStartE2EDuration="10.796281203s" podCreationTimestamp="2025-12-01 19:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:03.7949138 +0000 UTC m=+165.357473451" watchObservedRunningTime="2025-12-01 19:59:03.796281203 +0000 UTC m=+165.358840844" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.802569 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" event={"ID":"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3","Type":"ContainerStarted","Data":"934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09"} Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.802821 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" event={"ID":"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3","Type":"ContainerStarted","Data":"7d01374a6bed982cef7a09f5c25dcc57dfb8cbd9e86414dbda92b02a1219e31a"} Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.802835 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.814355 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Dec 01 19:59:03 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:03 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:03 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.814401 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:03 crc kubenswrapper[4802]: I1201 19:59:03.961323 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" podStartSLOduration=145.961303842 podStartE2EDuration="2m25.961303842s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:03.909618401 +0000 UTC m=+165.472178062" watchObservedRunningTime="2025-12-01 19:59:03.961303842 +0000 UTC m=+165.523863483" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.069386 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" podStartSLOduration=146.069366638 podStartE2EDuration="2m26.069366638s" podCreationTimestamp="2025-12-01 19:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:03.96994361 +0000 UTC m=+165.532503251" watchObservedRunningTime="2025-12-01 19:59:04.069366638 +0000 UTC m=+165.631926279" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.071059 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54h5m"] Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.709521 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.710570 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.712738 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.714591 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.718350 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.808347 4802 generic.go:334] "Generic (PLEG): container finished" podID="1ef3009d-6227-4034-8325-544c3386a9fd" containerID="c71015f776d0c2715eb53297e668dfa9ae133de4f484daa7fde1b795280dcb77" exitCode=0 Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.808387 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bph4q" event={"ID":"1ef3009d-6227-4034-8325-544c3386a9fd","Type":"ContainerDied","Data":"c71015f776d0c2715eb53297e668dfa9ae133de4f484daa7fde1b795280dcb77"} Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.809454 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.809512 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.810136 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:04 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:04 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:04 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.810164 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.811345 4802 generic.go:334] "Generic (PLEG): container finished" podID="edeba60a-7f88-48b8-a016-324a7527a666" containerID="93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1" exitCode=0 Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.811426 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h5m" event={"ID":"edeba60a-7f88-48b8-a016-324a7527a666","Type":"ContainerDied","Data":"93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1"} Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.811465 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h5m" event={"ID":"edeba60a-7f88-48b8-a016-324a7527a666","Type":"ContainerStarted","Data":"114cf08cddfd3ba08b7f0c3f3819be251a199fb2cbf5cfd9956e6ff7811da883"} Dec 01 
19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.813839 4802 generic.go:334] "Generic (PLEG): container finished" podID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerID="67c29fdccf17355015b44bd47bf8e4a5595a74382866ce1f2ec4003afb6f14b2" exitCode=0 Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.813896 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvm7z" event={"ID":"225bc123-af5b-421f-b8e4-a3a13ce3ee58","Type":"ContainerDied","Data":"67c29fdccf17355015b44bd47bf8e4a5595a74382866ce1f2ec4003afb6f14b2"} Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.813921 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvm7z" event={"ID":"225bc123-af5b-421f-b8e4-a3a13ce3ee58","Type":"ContainerStarted","Data":"037873669d53b3167762ad610934e99145230631e8ecc9217d070908c767ddf1"} Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.893882 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p7x"] Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.897535 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.899717 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p7x"] Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.902272 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.910296 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.910460 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.910571 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:04 crc kubenswrapper[4802]: I1201 19:59:04.931691 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.011540 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-catalog-content\") pod \"redhat-marketplace-k9p7x\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.011602 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-utilities\") pod \"redhat-marketplace-k9p7x\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.011661 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwszf\" (UniqueName: \"kubernetes.io/projected/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-kube-api-access-mwszf\") pod \"redhat-marketplace-k9p7x\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.025311 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.112788 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-catalog-content\") pod \"redhat-marketplace-k9p7x\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.112834 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-utilities\") pod \"redhat-marketplace-k9p7x\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.112865 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwszf\" (UniqueName: \"kubernetes.io/projected/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-kube-api-access-mwszf\") pod \"redhat-marketplace-k9p7x\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.113403 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-catalog-content\") pod \"redhat-marketplace-k9p7x\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.113542 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-utilities\") pod \"redhat-marketplace-k9p7x\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " 
pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.132517 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwszf\" (UniqueName: \"kubernetes.io/projected/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-kube-api-access-mwszf\") pod \"redhat-marketplace-k9p7x\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.225592 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.229251 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 19:59:05 crc kubenswrapper[4802]: W1201 19:59:05.258457 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod983c4f79_5c28_4f5c_8c4b_e876ad0d7e00.slice/crio-a1b11dcdbdba25430e8d309a54d67c06e5c83d05f91f355fdc223889f263eb96 WatchSource:0}: Error finding container a1b11dcdbdba25430e8d309a54d67c06e5c83d05f91f355fdc223889f263eb96: Status 404 returned error can't find the container with id a1b11dcdbdba25430e8d309a54d67c06e5c83d05f91f355fdc223889f263eb96 Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.292432 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jkqxz"] Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.293884 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.303013 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkqxz"] Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.418337 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z65l\" (UniqueName: \"kubernetes.io/projected/f4850b32-033b-451d-af1d-fad64baead63-kube-api-access-9z65l\") pod \"redhat-marketplace-jkqxz\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.418742 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-catalog-content\") pod \"redhat-marketplace-jkqxz\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.418773 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-utilities\") pod \"redhat-marketplace-jkqxz\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.453750 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p7x"] Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.520987 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z65l\" (UniqueName: \"kubernetes.io/projected/f4850b32-033b-451d-af1d-fad64baead63-kube-api-access-9z65l\") pod \"redhat-marketplace-jkqxz\" (UID: 
\"f4850b32-033b-451d-af1d-fad64baead63\") " pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.521098 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-catalog-content\") pod \"redhat-marketplace-jkqxz\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.521142 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-utilities\") pod \"redhat-marketplace-jkqxz\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.524478 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-utilities\") pod \"redhat-marketplace-jkqxz\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.524657 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-catalog-content\") pod \"redhat-marketplace-jkqxz\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.543907 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z65l\" (UniqueName: \"kubernetes.io/projected/f4850b32-033b-451d-af1d-fad64baead63-kube-api-access-9z65l\") pod \"redhat-marketplace-jkqxz\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " 
pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.617573 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.810225 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:05 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:05 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:05 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.810465 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.822988 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00","Type":"ContainerStarted","Data":"aeeb78c6912ceb62d564ef293233565bb3e9af247a4bcc550577483dbb05001f"} Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.823040 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00","Type":"ContainerStarted","Data":"a1b11dcdbdba25430e8d309a54d67c06e5c83d05f91f355fdc223889f263eb96"} Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.825059 4802 generic.go:334] "Generic (PLEG): container finished" podID="081dccc6-dbee-40a9-8333-d1c178e3fab3" containerID="14ba7088961527c5c9887e5909d2898863b4b063e7bb01065f35614ce104b714" exitCode=0 Dec 
01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.825115 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" event={"ID":"081dccc6-dbee-40a9-8333-d1c178e3fab3","Type":"ContainerDied","Data":"14ba7088961527c5c9887e5909d2898863b4b063e7bb01065f35614ce104b714"} Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.826659 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p7x" event={"ID":"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5","Type":"ContainerDied","Data":"5682fdc797da598d1b28821e01bb528ea3345f8e31232802248320950842c3e3"} Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.827063 4802 generic.go:334] "Generic (PLEG): container finished" podID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerID="5682fdc797da598d1b28821e01bb528ea3345f8e31232802248320950842c3e3" exitCode=0 Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.827147 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p7x" event={"ID":"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5","Type":"ContainerStarted","Data":"08917d33458c7eeca5a0dbb57719791023cd11df7d4c3c970d16fecfc8495d10"} Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.836218 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkqxz"] Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.838501 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.838480766 podStartE2EDuration="1.838480766s" podCreationTimestamp="2025-12-01 19:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:05.835596737 +0000 UTC m=+167.398156378" watchObservedRunningTime="2025-12-01 19:59:05.838480766 +0000 UTC m=+167.401040407" Dec 
01 19:59:05 crc kubenswrapper[4802]: W1201 19:59:05.854286 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4850b32_033b_451d_af1d_fad64baead63.slice/crio-fdad27d6cfbc5899c9045dcb2de2da6544d188c7233c6dbdcf117c629ab92a7f WatchSource:0}: Error finding container fdad27d6cfbc5899c9045dcb2de2da6544d188c7233c6dbdcf117c629ab92a7f: Status 404 returned error can't find the container with id fdad27d6cfbc5899c9045dcb2de2da6544d188c7233c6dbdcf117c629ab92a7f Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.886985 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.887349 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.889662 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c6vvc"] Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.891208 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.894408 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.895902 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:59:05 crc kubenswrapper[4802]: I1201 19:59:05.898893 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6vvc"] Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.030423 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-catalog-content\") pod \"redhat-operators-c6vvc\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.031553 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-utilities\") pod \"redhat-operators-c6vvc\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.031709 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndhmb\" (UniqueName: \"kubernetes.io/projected/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-kube-api-access-ndhmb\") pod \"redhat-operators-c6vvc\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.072611 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/downloads-7954f5f757-gjlgw" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.133078 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-catalog-content\") pod \"redhat-operators-c6vvc\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.133484 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-utilities\") pod \"redhat-operators-c6vvc\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.133531 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndhmb\" (UniqueName: \"kubernetes.io/projected/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-kube-api-access-ndhmb\") pod \"redhat-operators-c6vvc\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.133886 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-catalog-content\") pod \"redhat-operators-c6vvc\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.134433 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-utilities\") pod \"redhat-operators-c6vvc\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc 
kubenswrapper[4802]: I1201 19:59:06.158136 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndhmb\" (UniqueName: \"kubernetes.io/projected/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-kube-api-access-ndhmb\") pod \"redhat-operators-c6vvc\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.224447 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.252072 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.252122 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.254173 4802 patch_prober.go:28] interesting pod/console-f9d7485db-62kxj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.254236 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-62kxj" podUID="9f3283e5-38b2-4f3e-a0d4-122b734e79d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.295124 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z2tv2"] Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.296345 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.314589 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z2tv2"] Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.439432 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-utilities\") pod \"redhat-operators-z2tv2\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.439516 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-catalog-content\") pod \"redhat-operators-z2tv2\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.439541 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnd8\" (UniqueName: \"kubernetes.io/projected/ee24ae84-374a-46a9-8816-59086ecab84e-kube-api-access-2dnd8\") pod \"redhat-operators-z2tv2\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.447439 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c6vvc"] Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.541096 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-catalog-content\") pod \"redhat-operators-z2tv2\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " 
pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.541153 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dnd8\" (UniqueName: \"kubernetes.io/projected/ee24ae84-374a-46a9-8816-59086ecab84e-kube-api-access-2dnd8\") pod \"redhat-operators-z2tv2\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.541269 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-utilities\") pod \"redhat-operators-z2tv2\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.541677 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-utilities\") pod \"redhat-operators-z2tv2\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.546262 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-catalog-content\") pod \"redhat-operators-z2tv2\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.561784 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dnd8\" (UniqueName: \"kubernetes.io/projected/ee24ae84-374a-46a9-8816-59086ecab84e-kube-api-access-2dnd8\") pod \"redhat-operators-z2tv2\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 
crc kubenswrapper[4802]: I1201 19:59:06.615535 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.743661 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.743690 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.750938 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.809531 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.811158 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:06 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:06 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:06 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.811261 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.836965 4802 generic.go:334] "Generic (PLEG): container finished" podID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" 
containerID="dff4b0a63cec243c64d35c22bf2e4bedab08fe65c4cd04d20b67a7e8905d1fce" exitCode=0 Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.837307 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vvc" event={"ID":"2f8519d9-5b33-4c4d-b430-6497b8bcc71b","Type":"ContainerDied","Data":"dff4b0a63cec243c64d35c22bf2e4bedab08fe65c4cd04d20b67a7e8905d1fce"} Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.837376 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vvc" event={"ID":"2f8519d9-5b33-4c4d-b430-6497b8bcc71b","Type":"ContainerStarted","Data":"3775abdab8611631ca62e02c0f19059b5fc373aa000b9fbd2bd189f1e4fef510"} Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.852565 4802 generic.go:334] "Generic (PLEG): container finished" podID="f4850b32-033b-451d-af1d-fad64baead63" containerID="de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d" exitCode=0 Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.852792 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkqxz" event={"ID":"f4850b32-033b-451d-af1d-fad64baead63","Type":"ContainerDied","Data":"de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d"} Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.852834 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkqxz" event={"ID":"f4850b32-033b-451d-af1d-fad64baead63","Type":"ContainerStarted","Data":"fdad27d6cfbc5899c9045dcb2de2da6544d188c7233c6dbdcf117c629ab92a7f"} Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.860593 4802 generic.go:334] "Generic (PLEG): container finished" podID="983c4f79-5c28-4f5c-8c4b-e876ad0d7e00" containerID="aeeb78c6912ceb62d564ef293233565bb3e9af247a4bcc550577483dbb05001f" exitCode=0 Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.860703 4802 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00","Type":"ContainerDied","Data":"aeeb78c6912ceb62d564ef293233565bb3e9af247a4bcc550577483dbb05001f"} Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.866137 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dgmgt" Dec 01 19:59:06 crc kubenswrapper[4802]: I1201 19:59:06.866960 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-4czjq" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.078330 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z2tv2"] Dec 01 19:59:07 crc kubenswrapper[4802]: W1201 19:59:07.111813 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee24ae84_374a_46a9_8816_59086ecab84e.slice/crio-36f815ed245c4973ba57617035ef9417d6afda251881b767db85dc31ab1a93d8 WatchSource:0}: Error finding container 36f815ed245c4973ba57617035ef9417d6afda251881b767db85dc31ab1a93d8: Status 404 returned error can't find the container with id 36f815ed245c4973ba57617035ef9417d6afda251881b767db85dc31ab1a93d8 Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.258305 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.359837 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/081dccc6-dbee-40a9-8333-d1c178e3fab3-config-volume\") pod \"081dccc6-dbee-40a9-8333-d1c178e3fab3\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.359934 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6cp5\" (UniqueName: \"kubernetes.io/projected/081dccc6-dbee-40a9-8333-d1c178e3fab3-kube-api-access-x6cp5\") pod \"081dccc6-dbee-40a9-8333-d1c178e3fab3\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.359974 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/081dccc6-dbee-40a9-8333-d1c178e3fab3-secret-volume\") pod \"081dccc6-dbee-40a9-8333-d1c178e3fab3\" (UID: \"081dccc6-dbee-40a9-8333-d1c178e3fab3\") " Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.360666 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/081dccc6-dbee-40a9-8333-d1c178e3fab3-config-volume" (OuterVolumeSpecName: "config-volume") pod "081dccc6-dbee-40a9-8333-d1c178e3fab3" (UID: "081dccc6-dbee-40a9-8333-d1c178e3fab3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.365538 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081dccc6-dbee-40a9-8333-d1c178e3fab3-kube-api-access-x6cp5" (OuterVolumeSpecName: "kube-api-access-x6cp5") pod "081dccc6-dbee-40a9-8333-d1c178e3fab3" (UID: "081dccc6-dbee-40a9-8333-d1c178e3fab3"). 
InnerVolumeSpecName "kube-api-access-x6cp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.366705 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/081dccc6-dbee-40a9-8333-d1c178e3fab3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "081dccc6-dbee-40a9-8333-d1c178e3fab3" (UID: "081dccc6-dbee-40a9-8333-d1c178e3fab3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.461641 4802 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/081dccc6-dbee-40a9-8333-d1c178e3fab3-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.461674 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6cp5\" (UniqueName: \"kubernetes.io/projected/081dccc6-dbee-40a9-8333-d1c178e3fab3-kube-api-access-x6cp5\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.461685 4802 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/081dccc6-dbee-40a9-8333-d1c178e3fab3-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.811842 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:07 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:07 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:07 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.812150 4802 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.879798 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" event={"ID":"081dccc6-dbee-40a9-8333-d1c178e3fab3","Type":"ContainerDied","Data":"b3fec8a010de90205c8ecf7c6f5117e54dc9e532318635280854a01617672ba2"} Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.879838 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3fec8a010de90205c8ecf7c6f5117e54dc9e532318635280854a01617672ba2" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.879892 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv" Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.907289 4802 generic.go:334] "Generic (PLEG): container finished" podID="ee24ae84-374a-46a9-8816-59086ecab84e" containerID="df3b1c8e56f2ae47315fe7914d49d637b316e18f190a1ef763ab0a7470b6cd77" exitCode=0 Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.908525 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2tv2" event={"ID":"ee24ae84-374a-46a9-8816-59086ecab84e","Type":"ContainerDied","Data":"df3b1c8e56f2ae47315fe7914d49d637b316e18f190a1ef763ab0a7470b6cd77"} Dec 01 19:59:07 crc kubenswrapper[4802]: I1201 19:59:07.908546 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2tv2" event={"ID":"ee24ae84-374a-46a9-8816-59086ecab84e","Type":"ContainerStarted","Data":"36f815ed245c4973ba57617035ef9417d6afda251881b767db85dc31ab1a93d8"} Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.539248 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 19:59:08 crc kubenswrapper[4802]: E1201 19:59:08.539467 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081dccc6-dbee-40a9-8333-d1c178e3fab3" containerName="collect-profiles" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.539479 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="081dccc6-dbee-40a9-8333-d1c178e3fab3" containerName="collect-profiles" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.539572 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="081dccc6-dbee-40a9-8333-d1c178e3fab3" containerName="collect-profiles" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.539956 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.542726 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.542749 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.553694 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.679668 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25585a57-d017-4c7a-839b-819a01e647fa-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"25585a57-d017-4c7a-839b-819a01e647fa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.679757 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/25585a57-d017-4c7a-839b-819a01e647fa-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"25585a57-d017-4c7a-839b-819a01e647fa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.780794 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25585a57-d017-4c7a-839b-819a01e647fa-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"25585a57-d017-4c7a-839b-819a01e647fa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.780901 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25585a57-d017-4c7a-839b-819a01e647fa-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"25585a57-d017-4c7a-839b-819a01e647fa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.780994 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25585a57-d017-4c7a-839b-819a01e647fa-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"25585a57-d017-4c7a-839b-819a01e647fa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.798831 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25585a57-d017-4c7a-839b-819a01e647fa-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"25585a57-d017-4c7a-839b-819a01e647fa\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.810373 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:08 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:08 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:08 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.810424 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:08 crc kubenswrapper[4802]: I1201 19:59:08.870827 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:09 crc kubenswrapper[4802]: I1201 19:59:09.810719 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:09 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:09 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:09 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:09 crc kubenswrapper[4802]: I1201 19:59:09.811081 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:10 crc kubenswrapper[4802]: I1201 19:59:10.810816 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:10 crc kubenswrapper[4802]: [-]has-synced failed: reason 
withheld Dec 01 19:59:10 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:10 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:10 crc kubenswrapper[4802]: I1201 19:59:10.811106 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:11 crc kubenswrapper[4802]: I1201 19:59:11.809174 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:11 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:11 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:11 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:11 crc kubenswrapper[4802]: I1201 19:59:11.809236 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:11 crc kubenswrapper[4802]: I1201 19:59:11.945579 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g7fvq" Dec 01 19:59:12 crc kubenswrapper[4802]: I1201 19:59:12.810073 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:12 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:12 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:12 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:12 crc 
kubenswrapper[4802]: I1201 19:59:12.810375 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:13 crc kubenswrapper[4802]: I1201 19:59:13.810357 4802 patch_prober.go:28] interesting pod/router-default-5444994796-lvx7b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 19:59:13 crc kubenswrapper[4802]: [-]has-synced failed: reason withheld Dec 01 19:59:13 crc kubenswrapper[4802]: [+]process-running ok Dec 01 19:59:13 crc kubenswrapper[4802]: healthz check failed Dec 01 19:59:13 crc kubenswrapper[4802]: I1201 19:59:13.810415 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvx7b" podUID="7e98194d-958a-4c56-b5a3-90e01eab1816" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.528550 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.673967 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kubelet-dir\") pod \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\" (UID: \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\") " Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.674098 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "983c4f79-5c28-4f5c-8c4b-e876ad0d7e00" (UID: "983c4f79-5c28-4f5c-8c4b-e876ad0d7e00"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.674344 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kube-api-access\") pod \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\" (UID: \"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00\") " Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.674728 4802 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.681660 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "983c4f79-5c28-4f5c-8c4b-e876ad0d7e00" (UID: "983c4f79-5c28-4f5c-8c4b-e876ad0d7e00"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.775832 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/983c4f79-5c28-4f5c-8c4b-e876ad0d7e00-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.810506 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.812964 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lvx7b" Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.949418 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"983c4f79-5c28-4f5c-8c4b-e876ad0d7e00","Type":"ContainerDied","Data":"a1b11dcdbdba25430e8d309a54d67c06e5c83d05f91f355fdc223889f263eb96"} Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.949461 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 19:59:14 crc kubenswrapper[4802]: I1201 19:59:14.949469 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1b11dcdbdba25430e8d309a54d67c06e5c83d05f91f355fdc223889f263eb96" Dec 01 19:59:15 crc kubenswrapper[4802]: I1201 19:59:15.694412 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 19:59:16 crc kubenswrapper[4802]: I1201 19:59:16.252189 4802 patch_prober.go:28] interesting pod/console-f9d7485db-62kxj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 01 19:59:16 crc kubenswrapper[4802]: I1201 19:59:16.252550 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-62kxj" podUID="9f3283e5-38b2-4f3e-a0d4-122b734e79d4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 01 19:59:16 crc kubenswrapper[4802]: I1201 19:59:16.488261 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 19:59:16 crc kubenswrapper[4802]: W1201 19:59:16.492505 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod25585a57_d017_4c7a_839b_819a01e647fa.slice/crio-e1848dbf1f53fd18bda77cd4b96e09fed02fa24ddb4fba0aa66233081b13b7ba WatchSource:0}: Error finding container e1848dbf1f53fd18bda77cd4b96e09fed02fa24ddb4fba0aa66233081b13b7ba: Status 404 returned error can't find the container with id e1848dbf1f53fd18bda77cd4b96e09fed02fa24ddb4fba0aa66233081b13b7ba Dec 01 19:59:16 crc kubenswrapper[4802]: I1201 19:59:16.959030 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"25585a57-d017-4c7a-839b-819a01e647fa","Type":"ContainerStarted","Data":"e1848dbf1f53fd18bda77cd4b96e09fed02fa24ddb4fba0aa66233081b13b7ba"} Dec 01 19:59:17 crc kubenswrapper[4802]: I1201 19:59:17.966346 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"25585a57-d017-4c7a-839b-819a01e647fa","Type":"ContainerStarted","Data":"0f30de561830d889599948bd9a07739b45faa5737203ad353fbcb55b842cc31f"} Dec 01 19:59:17 crc kubenswrapper[4802]: I1201 19:59:17.981256 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=9.981231683 podStartE2EDuration="9.981231683s" podCreationTimestamp="2025-12-01 19:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:17.98111985 +0000 UTC m=+179.543679491" watchObservedRunningTime="2025-12-01 19:59:17.981231683 +0000 UTC m=+179.543791334" Dec 01 19:59:18 crc kubenswrapper[4802]: I1201 19:59:18.971381 4802 generic.go:334] "Generic (PLEG): container finished" podID="25585a57-d017-4c7a-839b-819a01e647fa" containerID="0f30de561830d889599948bd9a07739b45faa5737203ad353fbcb55b842cc31f" exitCode=0 Dec 01 19:59:18 crc kubenswrapper[4802]: I1201 19:59:18.971545 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"25585a57-d017-4c7a-839b-819a01e647fa","Type":"ContainerDied","Data":"0f30de561830d889599948bd9a07739b45faa5737203ad353fbcb55b842cc31f"} Dec 01 19:59:22 crc kubenswrapper[4802]: I1201 19:59:22.865699 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 19:59:26 crc kubenswrapper[4802]: I1201 19:59:26.258035 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:59:26 crc kubenswrapper[4802]: I1201 19:59:26.265398 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-62kxj" Dec 01 19:59:28 crc kubenswrapper[4802]: I1201 19:59:28.089903 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 19:59:28 crc kubenswrapper[4802]: I1201 19:59:28.089993 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 19:59:28 crc kubenswrapper[4802]: I1201 19:59:28.572562 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:28 crc kubenswrapper[4802]: I1201 19:59:28.583270 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25585a57-d017-4c7a-839b-819a01e647fa-kube-api-access\") pod \"25585a57-d017-4c7a-839b-819a01e647fa\" (UID: \"25585a57-d017-4c7a-839b-819a01e647fa\") " Dec 01 19:59:28 crc kubenswrapper[4802]: I1201 19:59:28.583426 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25585a57-d017-4c7a-839b-819a01e647fa-kubelet-dir\") pod \"25585a57-d017-4c7a-839b-819a01e647fa\" (UID: \"25585a57-d017-4c7a-839b-819a01e647fa\") " Dec 01 19:59:28 crc kubenswrapper[4802]: I1201 19:59:28.583472 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25585a57-d017-4c7a-839b-819a01e647fa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "25585a57-d017-4c7a-839b-819a01e647fa" (UID: "25585a57-d017-4c7a-839b-819a01e647fa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 19:59:28 crc kubenswrapper[4802]: I1201 19:59:28.583772 4802 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25585a57-d017-4c7a-839b-819a01e647fa-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:28 crc kubenswrapper[4802]: I1201 19:59:28.588595 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25585a57-d017-4c7a-839b-819a01e647fa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "25585a57-d017-4c7a-839b-819a01e647fa" (UID: "25585a57-d017-4c7a-839b-819a01e647fa"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:59:28 crc kubenswrapper[4802]: I1201 19:59:28.685324 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25585a57-d017-4c7a-839b-819a01e647fa-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:29 crc kubenswrapper[4802]: I1201 19:59:29.027034 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"25585a57-d017-4c7a-839b-819a01e647fa","Type":"ContainerDied","Data":"e1848dbf1f53fd18bda77cd4b96e09fed02fa24ddb4fba0aa66233081b13b7ba"} Dec 01 19:59:29 crc kubenswrapper[4802]: I1201 19:59:29.027071 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1848dbf1f53fd18bda77cd4b96e09fed02fa24ddb4fba0aa66233081b13b7ba" Dec 01 19:59:29 crc kubenswrapper[4802]: I1201 19:59:29.027082 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 19:59:36 crc kubenswrapper[4802]: I1201 19:59:36.827381 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dml5n" Dec 01 19:59:38 crc kubenswrapper[4802]: E1201 19:59:38.931761 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 19:59:38 crc kubenswrapper[4802]: E1201 19:59:38.932269 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9n56l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-54h5m_openshift-marketplace(edeba60a-7f88-48b8-a016-324a7527a666): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 19:59:38 crc kubenswrapper[4802]: E1201 19:59:38.933397 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-54h5m" podUID="edeba60a-7f88-48b8-a016-324a7527a666" Dec 01 19:59:39 crc 
kubenswrapper[4802]: E1201 19:59:39.026864 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 19:59:39 crc kubenswrapper[4802]: E1201 19:59:39.027045 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58m2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-bph4q_openshift-marketplace(1ef3009d-6227-4034-8325-544c3386a9fd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 19:59:39 crc kubenswrapper[4802]: E1201 19:59:39.028486 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bph4q" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" Dec 01 19:59:40 crc kubenswrapper[4802]: E1201 19:59:40.432104 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bph4q" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" Dec 01 19:59:40 crc kubenswrapper[4802]: E1201 19:59:40.432179 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-54h5m" podUID="edeba60a-7f88-48b8-a016-324a7527a666" Dec 01 19:59:40 crc kubenswrapper[4802]: E1201 19:59:40.500473 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 19:59:40 crc kubenswrapper[4802]: E1201 19:59:40.500633 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c49j2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dtjsm_openshift-marketplace(3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 19:59:40 crc kubenswrapper[4802]: E1201 19:59:40.502019 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dtjsm" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" Dec 01 19:59:40 crc kubenswrapper[4802]: E1201 19:59:40.544010 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 19:59:40 crc kubenswrapper[4802]: E1201 19:59:40.544137 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gh59m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qvm7z_openshift-marketplace(225bc123-af5b-421f-b8e4-a3a13ce3ee58): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 19:59:40 crc kubenswrapper[4802]: E1201 19:59:40.545544 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qvm7z" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" Dec 01 19:59:41 crc kubenswrapper[4802]: E1201 19:59:41.329787 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dtjsm" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" Dec 01 19:59:41 crc kubenswrapper[4802]: E1201 19:59:41.329814 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qvm7z" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" Dec 01 19:59:41 crc kubenswrapper[4802]: E1201 19:59:41.421769 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 19:59:41 crc kubenswrapper[4802]: E1201 19:59:41.421924 4802 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mwszf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-k9p7x_openshift-marketplace(d4f1a65e-833b-4450-a3b1-7d4f67f05fb5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 19:59:41 crc kubenswrapper[4802]: E1201 19:59:41.423479 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-k9p7x" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" Dec 01 19:59:41 crc kubenswrapper[4802]: E1201 19:59:41.437103 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 19:59:41 crc kubenswrapper[4802]: E1201 19:59:41.437481 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9z65l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,Ap
pArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jkqxz_openshift-marketplace(f4850b32-033b-451d-af1d-fad64baead63): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 19:59:41 crc kubenswrapper[4802]: E1201 19:59:41.438688 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jkqxz" podUID="f4850b32-033b-451d-af1d-fad64baead63" Dec 01 19:59:43 crc kubenswrapper[4802]: E1201 19:59:43.954375 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jkqxz" podUID="f4850b32-033b-451d-af1d-fad64baead63" Dec 01 19:59:43 crc kubenswrapper[4802]: E1201 19:59:43.954415 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-k9p7x" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" Dec 01 19:59:43 crc kubenswrapper[4802]: E1201 19:59:43.976741 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 19:59:43 crc 
kubenswrapper[4802]: E1201 19:59:43.976943 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndhmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-c6vvc_openshift-marketplace(2f8519d9-5b33-4c4d-b430-6497b8bcc71b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 19:59:43 crc kubenswrapper[4802]: E1201 19:59:43.978174 4802 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-c6vvc" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" Dec 01 19:59:44 crc kubenswrapper[4802]: E1201 19:59:44.113325 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-c6vvc" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.739446 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 19:59:44 crc kubenswrapper[4802]: E1201 19:59:44.739709 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25585a57-d017-4c7a-839b-819a01e647fa" containerName="pruner" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.739724 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="25585a57-d017-4c7a-839b-819a01e647fa" containerName="pruner" Dec 01 19:59:44 crc kubenswrapper[4802]: E1201 19:59:44.739747 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="983c4f79-5c28-4f5c-8c4b-e876ad0d7e00" containerName="pruner" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.739754 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="983c4f79-5c28-4f5c-8c4b-e876ad0d7e00" containerName="pruner" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.739868 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="25585a57-d017-4c7a-839b-819a01e647fa" containerName="pruner" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.739889 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="983c4f79-5c28-4f5c-8c4b-e876ad0d7e00" 
containerName="pruner" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.740393 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.742395 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.743167 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.758238 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.812074 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e045f3ff-764d-4b0b-9571-717eaed7587d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e045f3ff-764d-4b0b-9571-717eaed7587d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.812449 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e045f3ff-764d-4b0b-9571-717eaed7587d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e045f3ff-764d-4b0b-9571-717eaed7587d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.913891 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e045f3ff-764d-4b0b-9571-717eaed7587d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e045f3ff-764d-4b0b-9571-717eaed7587d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 
19:59:44.913998 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e045f3ff-764d-4b0b-9571-717eaed7587d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e045f3ff-764d-4b0b-9571-717eaed7587d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.914566 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e045f3ff-764d-4b0b-9571-717eaed7587d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e045f3ff-764d-4b0b-9571-717eaed7587d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:44 crc kubenswrapper[4802]: I1201 19:59:44.940241 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e045f3ff-764d-4b0b-9571-717eaed7587d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e045f3ff-764d-4b0b-9571-717eaed7587d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:45 crc kubenswrapper[4802]: I1201 19:59:45.054518 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:45 crc kubenswrapper[4802]: I1201 19:59:45.119828 4802 generic.go:334] "Generic (PLEG): container finished" podID="ee24ae84-374a-46a9-8816-59086ecab84e" containerID="1ba9267003bcc38951aee559b0e13b59e7d7408d7fecceb9be2997382c2c3f43" exitCode=0 Dec 01 19:59:45 crc kubenswrapper[4802]: I1201 19:59:45.119859 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2tv2" event={"ID":"ee24ae84-374a-46a9-8816-59086ecab84e","Type":"ContainerDied","Data":"1ba9267003bcc38951aee559b0e13b59e7d7408d7fecceb9be2997382c2c3f43"} Dec 01 19:59:45 crc kubenswrapper[4802]: I1201 19:59:45.446336 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 19:59:45 crc kubenswrapper[4802]: W1201 19:59:45.453714 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode045f3ff_764d_4b0b_9571_717eaed7587d.slice/crio-a4ae1b92921cdd11e853c76ea3d624346826a8241ba1cf3725b1915f15f08733 WatchSource:0}: Error finding container a4ae1b92921cdd11e853c76ea3d624346826a8241ba1cf3725b1915f15f08733: Status 404 returned error can't find the container with id a4ae1b92921cdd11e853c76ea3d624346826a8241ba1cf3725b1915f15f08733 Dec 01 19:59:46 crc kubenswrapper[4802]: I1201 19:59:46.126504 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e045f3ff-764d-4b0b-9571-717eaed7587d","Type":"ContainerStarted","Data":"8ade45b341e70ec1ef6b5d8d1ca9fc370a35b030fdbd1d9d3fd1cbc96f92ebc8"} Dec 01 19:59:46 crc kubenswrapper[4802]: I1201 19:59:46.126844 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e045f3ff-764d-4b0b-9571-717eaed7587d","Type":"ContainerStarted","Data":"a4ae1b92921cdd11e853c76ea3d624346826a8241ba1cf3725b1915f15f08733"} Dec 01 19:59:46 crc 
kubenswrapper[4802]: I1201 19:59:46.128515 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2tv2" event={"ID":"ee24ae84-374a-46a9-8816-59086ecab84e","Type":"ContainerStarted","Data":"148102381a6a8e47c45f5d7e829a9efb40eee39d8b45cc022d8ad3bcb58250e7"} Dec 01 19:59:46 crc kubenswrapper[4802]: I1201 19:59:46.142113 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.142092951 podStartE2EDuration="2.142092951s" podCreationTimestamp="2025-12-01 19:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:46.140231154 +0000 UTC m=+207.702790795" watchObservedRunningTime="2025-12-01 19:59:46.142092951 +0000 UTC m=+207.704652602" Dec 01 19:59:46 crc kubenswrapper[4802]: I1201 19:59:46.616159 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:46 crc kubenswrapper[4802]: I1201 19:59:46.616226 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:47 crc kubenswrapper[4802]: I1201 19:59:47.134953 4802 generic.go:334] "Generic (PLEG): container finished" podID="e045f3ff-764d-4b0b-9571-717eaed7587d" containerID="8ade45b341e70ec1ef6b5d8d1ca9fc370a35b030fdbd1d9d3fd1cbc96f92ebc8" exitCode=0 Dec 01 19:59:47 crc kubenswrapper[4802]: I1201 19:59:47.136324 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e045f3ff-764d-4b0b-9571-717eaed7587d","Type":"ContainerDied","Data":"8ade45b341e70ec1ef6b5d8d1ca9fc370a35b030fdbd1d9d3fd1cbc96f92ebc8"} Dec 01 19:59:47 crc kubenswrapper[4802]: I1201 19:59:47.156403 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-z2tv2" podStartSLOduration=9.953748554 podStartE2EDuration="41.156380408s" podCreationTimestamp="2025-12-01 19:59:06 +0000 UTC" firstStartedPulling="2025-12-01 19:59:14.467520995 +0000 UTC m=+176.030080636" lastFinishedPulling="2025-12-01 19:59:45.670152849 +0000 UTC m=+207.232712490" observedRunningTime="2025-12-01 19:59:46.160306746 +0000 UTC m=+207.722866387" watchObservedRunningTime="2025-12-01 19:59:47.156380408 +0000 UTC m=+208.718940079" Dec 01 19:59:47 crc kubenswrapper[4802]: I1201 19:59:47.674711 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z2tv2" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" containerName="registry-server" probeResult="failure" output=< Dec 01 19:59:47 crc kubenswrapper[4802]: timeout: failed to connect service ":50051" within 1s Dec 01 19:59:47 crc kubenswrapper[4802]: > Dec 01 19:59:48 crc kubenswrapper[4802]: I1201 19:59:48.426576 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:48 crc kubenswrapper[4802]: I1201 19:59:48.471612 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e045f3ff-764d-4b0b-9571-717eaed7587d-kube-api-access\") pod \"e045f3ff-764d-4b0b-9571-717eaed7587d\" (UID: \"e045f3ff-764d-4b0b-9571-717eaed7587d\") " Dec 01 19:59:48 crc kubenswrapper[4802]: I1201 19:59:48.471674 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e045f3ff-764d-4b0b-9571-717eaed7587d-kubelet-dir\") pod \"e045f3ff-764d-4b0b-9571-717eaed7587d\" (UID: \"e045f3ff-764d-4b0b-9571-717eaed7587d\") " Dec 01 19:59:48 crc kubenswrapper[4802]: I1201 19:59:48.471999 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e045f3ff-764d-4b0b-9571-717eaed7587d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e045f3ff-764d-4b0b-9571-717eaed7587d" (UID: "e045f3ff-764d-4b0b-9571-717eaed7587d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 19:59:48 crc kubenswrapper[4802]: I1201 19:59:48.478684 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e045f3ff-764d-4b0b-9571-717eaed7587d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e045f3ff-764d-4b0b-9571-717eaed7587d" (UID: "e045f3ff-764d-4b0b-9571-717eaed7587d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:59:48 crc kubenswrapper[4802]: I1201 19:59:48.573702 4802 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e045f3ff-764d-4b0b-9571-717eaed7587d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:48 crc kubenswrapper[4802]: I1201 19:59:48.573745 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e045f3ff-764d-4b0b-9571-717eaed7587d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.146937 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e045f3ff-764d-4b0b-9571-717eaed7587d","Type":"ContainerDied","Data":"a4ae1b92921cdd11e853c76ea3d624346826a8241ba1cf3725b1915f15f08733"} Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.147223 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4ae1b92921cdd11e853c76ea3d624346826a8241ba1cf3725b1915f15f08733" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.147010 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.531537 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 19:59:49 crc kubenswrapper[4802]: E1201 19:59:49.531777 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e045f3ff-764d-4b0b-9571-717eaed7587d" containerName="pruner" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.531791 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e045f3ff-764d-4b0b-9571-717eaed7587d" containerName="pruner" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.531898 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="e045f3ff-764d-4b0b-9571-717eaed7587d" containerName="pruner" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.532250 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.534243 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.534761 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.546519 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.586038 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kube-api-access\") pod \"installer-9-crc\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.586106 4802 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.586179 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-var-lock\") pod \"installer-9-crc\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.688333 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-var-lock\") pod \"installer-9-crc\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.688460 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kube-api-access\") pod \"installer-9-crc\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.688506 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.688603 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.688641 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-var-lock\") pod \"installer-9-crc\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.710756 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kube-api-access\") pod \"installer-9-crc\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:49 crc kubenswrapper[4802]: I1201 19:59:49.849630 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 19:59:50 crc kubenswrapper[4802]: I1201 19:59:50.226044 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 19:59:50 crc kubenswrapper[4802]: W1201 19:59:50.233542 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb5cbd352_e2b4_4650_8265_e7b26b8890b4.slice/crio-cb9dcd1e02d623a2c1707b680e1c8c7c850bef81b22789b1daa3dc2fe56717d4 WatchSource:0}: Error finding container cb9dcd1e02d623a2c1707b680e1c8c7c850bef81b22789b1daa3dc2fe56717d4: Status 404 returned error can't find the container with id cb9dcd1e02d623a2c1707b680e1c8c7c850bef81b22789b1daa3dc2fe56717d4 Dec 01 19:59:51 crc kubenswrapper[4802]: I1201 19:59:51.158008 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b5cbd352-e2b4-4650-8265-e7b26b8890b4","Type":"ContainerStarted","Data":"cb9dcd1e02d623a2c1707b680e1c8c7c850bef81b22789b1daa3dc2fe56717d4"} Dec 01 19:59:52 crc kubenswrapper[4802]: I1201 19:59:52.163353 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b5cbd352-e2b4-4650-8265-e7b26b8890b4","Type":"ContainerStarted","Data":"e28f48f879146297575f78dbbce66e47a704c57059f8a4d43ae8cbd659a7f27c"} Dec 01 19:59:52 crc kubenswrapper[4802]: I1201 19:59:52.181991 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.18197157 podStartE2EDuration="3.18197157s" podCreationTimestamp="2025-12-01 19:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 19:59:52.179153053 +0000 UTC m=+213.741712704" watchObservedRunningTime="2025-12-01 19:59:52.18197157 +0000 UTC m=+213.744531221" Dec 01 19:59:55 crc kubenswrapper[4802]: I1201 
19:59:55.467149 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nsqcw"] Dec 01 19:59:56 crc kubenswrapper[4802]: I1201 19:59:56.781291 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:56 crc kubenswrapper[4802]: I1201 19:59:56.817502 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:58 crc kubenswrapper[4802]: I1201 19:59:58.089145 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 19:59:58 crc kubenswrapper[4802]: I1201 19:59:58.089884 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 19:59:58 crc kubenswrapper[4802]: I1201 19:59:58.090021 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 19:59:58 crc kubenswrapper[4802]: I1201 19:59:58.091093 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 19:59:58 crc kubenswrapper[4802]: I1201 19:59:58.091284 4802 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809" gracePeriod=600 Dec 01 19:59:58 crc kubenswrapper[4802]: I1201 19:59:58.154401 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z2tv2"] Dec 01 19:59:58 crc kubenswrapper[4802]: I1201 19:59:58.200433 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z2tv2" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" containerName="registry-server" containerID="cri-o://148102381a6a8e47c45f5d7e829a9efb40eee39d8b45cc022d8ad3bcb58250e7" gracePeriod=2 Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.210790 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809" exitCode=0 Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.210942 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809"} Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.214035 4802 generic.go:334] "Generic (PLEG): container finished" podID="ee24ae84-374a-46a9-8816-59086ecab84e" containerID="148102381a6a8e47c45f5d7e829a9efb40eee39d8b45cc022d8ad3bcb58250e7" exitCode=0 Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.214071 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2tv2" event={"ID":"ee24ae84-374a-46a9-8816-59086ecab84e","Type":"ContainerDied","Data":"148102381a6a8e47c45f5d7e829a9efb40eee39d8b45cc022d8ad3bcb58250e7"} 
Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.275054 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.309983 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dnd8\" (UniqueName: \"kubernetes.io/projected/ee24ae84-374a-46a9-8816-59086ecab84e-kube-api-access-2dnd8\") pod \"ee24ae84-374a-46a9-8816-59086ecab84e\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.310082 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-catalog-content\") pod \"ee24ae84-374a-46a9-8816-59086ecab84e\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.310112 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-utilities\") pod \"ee24ae84-374a-46a9-8816-59086ecab84e\" (UID: \"ee24ae84-374a-46a9-8816-59086ecab84e\") " Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.311093 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-utilities" (OuterVolumeSpecName: "utilities") pod "ee24ae84-374a-46a9-8816-59086ecab84e" (UID: "ee24ae84-374a-46a9-8816-59086ecab84e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.317174 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee24ae84-374a-46a9-8816-59086ecab84e-kube-api-access-2dnd8" (OuterVolumeSpecName: "kube-api-access-2dnd8") pod "ee24ae84-374a-46a9-8816-59086ecab84e" (UID: "ee24ae84-374a-46a9-8816-59086ecab84e"). InnerVolumeSpecName "kube-api-access-2dnd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.412085 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dnd8\" (UniqueName: \"kubernetes.io/projected/ee24ae84-374a-46a9-8816-59086ecab84e-kube-api-access-2dnd8\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.412124 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.436396 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee24ae84-374a-46a9-8816-59086ecab84e" (UID: "ee24ae84-374a-46a9-8816-59086ecab84e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 19:59:59 crc kubenswrapper[4802]: I1201 19:59:59.513805 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee24ae84-374a-46a9-8816-59086ecab84e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.131014 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb"] Dec 01 20:00:00 crc kubenswrapper[4802]: E1201 20:00:00.132791 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" containerName="extract-content" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.132834 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" containerName="extract-content" Dec 01 20:00:00 crc kubenswrapper[4802]: E1201 20:00:00.132883 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" containerName="registry-server" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.132893 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" containerName="registry-server" Dec 01 20:00:00 crc kubenswrapper[4802]: E1201 20:00:00.132917 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" containerName="extract-utilities" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.132930 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" containerName="extract-utilities" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.133390 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" containerName="registry-server" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.134122 4802 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.137113 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.137541 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.153074 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb"] Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.221995 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjxh\" (UniqueName: \"kubernetes.io/projected/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-kube-api-access-tsjxh\") pod \"collect-profiles-29410320-ptlrb\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.222083 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-secret-volume\") pod \"collect-profiles-29410320-ptlrb\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.222161 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-config-volume\") pod \"collect-profiles-29410320-ptlrb\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.225848 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"41318f41b1d38715fc9ba5256975e0f7c998f38cecb3c870cf77cc1261f579e2"} Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.228571 4802 generic.go:334] "Generic (PLEG): container finished" podID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerID="015b2a2cd71b2686455cd2930b082b4f9b0d357c6b0339a5a8d4dfc897b36b03" exitCode=0 Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.228625 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtjsm" event={"ID":"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b","Type":"ContainerDied","Data":"015b2a2cd71b2686455cd2930b082b4f9b0d357c6b0339a5a8d4dfc897b36b03"} Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.230945 4802 generic.go:334] "Generic (PLEG): container finished" podID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerID="f7fb812ae1b0870bb01d3bd227a151bfd3ef8c8d113fd43683b19fd4415c4f67" exitCode=0 Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.231065 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vvc" event={"ID":"2f8519d9-5b33-4c4d-b430-6497b8bcc71b","Type":"ContainerDied","Data":"f7fb812ae1b0870bb01d3bd227a151bfd3ef8c8d113fd43683b19fd4415c4f67"} Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.234688 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2tv2" event={"ID":"ee24ae84-374a-46a9-8816-59086ecab84e","Type":"ContainerDied","Data":"36f815ed245c4973ba57617035ef9417d6afda251881b767db85dc31ab1a93d8"} Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.234739 4802 scope.go:117] "RemoveContainer" 
containerID="148102381a6a8e47c45f5d7e829a9efb40eee39d8b45cc022d8ad3bcb58250e7" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.234887 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2tv2" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.240851 4802 generic.go:334] "Generic (PLEG): container finished" podID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerID="90cb9bc3039e0bcdc2af61aef1c2c68a587ebc280471bf5baa80ad774774de95" exitCode=0 Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.240941 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvm7z" event={"ID":"225bc123-af5b-421f-b8e4-a3a13ce3ee58","Type":"ContainerDied","Data":"90cb9bc3039e0bcdc2af61aef1c2c68a587ebc280471bf5baa80ad774774de95"} Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.245015 4802 generic.go:334] "Generic (PLEG): container finished" podID="1ef3009d-6227-4034-8325-544c3386a9fd" containerID="759da204a667308b56f6747b2ff785161a756a324bc80faa407bfda5756037ad" exitCode=0 Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.245353 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bph4q" event={"ID":"1ef3009d-6227-4034-8325-544c3386a9fd","Type":"ContainerDied","Data":"759da204a667308b56f6747b2ff785161a756a324bc80faa407bfda5756037ad"} Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.248314 4802 generic.go:334] "Generic (PLEG): container finished" podID="edeba60a-7f88-48b8-a016-324a7527a666" containerID="d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77" exitCode=0 Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.248752 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h5m" 
event={"ID":"edeba60a-7f88-48b8-a016-324a7527a666","Type":"ContainerDied","Data":"d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77"} Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.255907 4802 generic.go:334] "Generic (PLEG): container finished" podID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerID="0116d55eba32961f51911f0e9406982b986658788f480f4ec75cf8ef62659ae9" exitCode=0 Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.255967 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p7x" event={"ID":"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5","Type":"ContainerDied","Data":"0116d55eba32961f51911f0e9406982b986658788f480f4ec75cf8ef62659ae9"} Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.277079 4802 scope.go:117] "RemoveContainer" containerID="1ba9267003bcc38951aee559b0e13b59e7d7408d7fecceb9be2997382c2c3f43" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.301003 4802 scope.go:117] "RemoveContainer" containerID="df3b1c8e56f2ae47315fe7914d49d637b316e18f190a1ef763ab0a7470b6cd77" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.323212 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjxh\" (UniqueName: \"kubernetes.io/projected/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-kube-api-access-tsjxh\") pod \"collect-profiles-29410320-ptlrb\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.323286 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-secret-volume\") pod \"collect-profiles-29410320-ptlrb\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.323363 
4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-config-volume\") pod \"collect-profiles-29410320-ptlrb\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.326308 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-config-volume\") pod \"collect-profiles-29410320-ptlrb\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.331464 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-secret-volume\") pod \"collect-profiles-29410320-ptlrb\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.352181 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjxh\" (UniqueName: \"kubernetes.io/projected/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-kube-api-access-tsjxh\") pod \"collect-profiles-29410320-ptlrb\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.379069 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z2tv2"] Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.381896 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z2tv2"] Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 
20:00:00.459601 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.738385 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee24ae84-374a-46a9-8816-59086ecab84e" path="/var/lib/kubelet/pods/ee24ae84-374a-46a9-8816-59086ecab84e/volumes" Dec 01 20:00:00 crc kubenswrapper[4802]: I1201 20:00:00.879740 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb"] Dec 01 20:00:00 crc kubenswrapper[4802]: W1201 20:00:00.960397 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c376edc_1f4d_4651_ad0c_cdb7b1412a6c.slice/crio-e8712e27b55ad3a416e36e4a211a9599a2aa383144b918859db88ba5147505a5 WatchSource:0}: Error finding container e8712e27b55ad3a416e36e4a211a9599a2aa383144b918859db88ba5147505a5: Status 404 returned error can't find the container with id e8712e27b55ad3a416e36e4a211a9599a2aa383144b918859db88ba5147505a5 Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.281965 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvm7z" event={"ID":"225bc123-af5b-421f-b8e4-a3a13ce3ee58","Type":"ContainerStarted","Data":"81563d6cdb065ba58b870d6450951c9ea69a23e64e7e3017824d8dc41342a8a9"} Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.284761 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" event={"ID":"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c","Type":"ContainerStarted","Data":"5ccf237f66eb18a6d81651e774b4abc173c2f7cefc2728ef2947d6a60d97b7ca"} Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.284818 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" event={"ID":"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c","Type":"ContainerStarted","Data":"e8712e27b55ad3a416e36e4a211a9599a2aa383144b918859db88ba5147505a5"} Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.290762 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bph4q" event={"ID":"1ef3009d-6227-4034-8325-544c3386a9fd","Type":"ContainerStarted","Data":"9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7"} Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.293800 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p7x" event={"ID":"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5","Type":"ContainerStarted","Data":"cfee11450d517ccde8729166af1baf922ff181cf5ded64da4732e5bf8687e15e"} Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.298281 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vvc" event={"ID":"2f8519d9-5b33-4c4d-b430-6497b8bcc71b","Type":"ContainerStarted","Data":"5697dd74be08b6b68fb9fb225ea44b1781a2eb0e0429ac647f728fee279a3933"} Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.301398 4802 generic.go:334] "Generic (PLEG): container finished" podID="f4850b32-033b-451d-af1d-fad64baead63" containerID="fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046" exitCode=0 Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.301461 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkqxz" event={"ID":"f4850b32-033b-451d-af1d-fad64baead63","Type":"ContainerDied","Data":"fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046"} Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.315911 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h5m" 
event={"ID":"edeba60a-7f88-48b8-a016-324a7527a666","Type":"ContainerStarted","Data":"6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11"} Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.320801 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qvm7z" podStartSLOduration=2.147275484 podStartE2EDuration="58.320773202s" podCreationTimestamp="2025-12-01 19:59:03 +0000 UTC" firstStartedPulling="2025-12-01 19:59:04.816179973 +0000 UTC m=+166.378739614" lastFinishedPulling="2025-12-01 20:00:00.989677691 +0000 UTC m=+222.552237332" observedRunningTime="2025-12-01 20:00:01.315141128 +0000 UTC m=+222.877700769" watchObservedRunningTime="2025-12-01 20:00:01.320773202 +0000 UTC m=+222.883332843" Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.323922 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtjsm" event={"ID":"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b","Type":"ContainerStarted","Data":"5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1"} Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.341788 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bph4q" podStartSLOduration=3.287275921 podStartE2EDuration="59.341760563s" podCreationTimestamp="2025-12-01 19:59:02 +0000 UTC" firstStartedPulling="2025-12-01 19:59:04.810333202 +0000 UTC m=+166.372892843" lastFinishedPulling="2025-12-01 20:00:00.864817844 +0000 UTC m=+222.427377485" observedRunningTime="2025-12-01 20:00:01.339157752 +0000 UTC m=+222.901717393" watchObservedRunningTime="2025-12-01 20:00:01.341760563 +0000 UTC m=+222.904320214" Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.358965 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c6vvc" podStartSLOduration=2.27281321 podStartE2EDuration="56.358949245s" 
podCreationTimestamp="2025-12-01 19:59:05 +0000 UTC" firstStartedPulling="2025-12-01 19:59:06.840347629 +0000 UTC m=+168.402907270" lastFinishedPulling="2025-12-01 20:00:00.926483664 +0000 UTC m=+222.489043305" observedRunningTime="2025-12-01 20:00:01.355544249 +0000 UTC m=+222.918103900" watchObservedRunningTime="2025-12-01 20:00:01.358949245 +0000 UTC m=+222.921508886" Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.372145 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" podStartSLOduration=1.372119932 podStartE2EDuration="1.372119932s" podCreationTimestamp="2025-12-01 20:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:00:01.370508812 +0000 UTC m=+222.933068453" watchObservedRunningTime="2025-12-01 20:00:01.372119932 +0000 UTC m=+222.934679583" Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.393363 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k9p7x" podStartSLOduration=2.401242981 podStartE2EDuration="57.393345689s" podCreationTimestamp="2025-12-01 19:59:04 +0000 UTC" firstStartedPulling="2025-12-01 19:59:05.828040023 +0000 UTC m=+167.390599654" lastFinishedPulling="2025-12-01 20:00:00.820142721 +0000 UTC m=+222.382702362" observedRunningTime="2025-12-01 20:00:01.391500972 +0000 UTC m=+222.954060623" watchObservedRunningTime="2025-12-01 20:00:01.393345689 +0000 UTC m=+222.955905330" Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.447377 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dtjsm" podStartSLOduration=2.437503168 podStartE2EDuration="59.447356902s" podCreationTimestamp="2025-12-01 19:59:02 +0000 UTC" firstStartedPulling="2025-12-01 19:59:03.764359634 +0000 UTC m=+165.326919275" 
lastFinishedPulling="2025-12-01 20:00:00.774213368 +0000 UTC m=+222.336773009" observedRunningTime="2025-12-01 20:00:01.442818002 +0000 UTC m=+223.005377653" watchObservedRunningTime="2025-12-01 20:00:01.447356902 +0000 UTC m=+223.009916543" Dec 01 20:00:01 crc kubenswrapper[4802]: I1201 20:00:01.466915 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-54h5m" podStartSLOduration=2.348218577 podStartE2EDuration="58.466889497s" podCreationTimestamp="2025-12-01 19:59:03 +0000 UTC" firstStartedPulling="2025-12-01 19:59:04.813088807 +0000 UTC m=+166.375648458" lastFinishedPulling="2025-12-01 20:00:00.931759737 +0000 UTC m=+222.494319378" observedRunningTime="2025-12-01 20:00:01.46471678 +0000 UTC m=+223.027276431" watchObservedRunningTime="2025-12-01 20:00:01.466889497 +0000 UTC m=+223.029449128" Dec 01 20:00:02 crc kubenswrapper[4802]: I1201 20:00:02.332178 4802 generic.go:334] "Generic (PLEG): container finished" podID="7c376edc-1f4d-4651-ad0c-cdb7b1412a6c" containerID="5ccf237f66eb18a6d81651e774b4abc173c2f7cefc2728ef2947d6a60d97b7ca" exitCode=0 Dec 01 20:00:02 crc kubenswrapper[4802]: I1201 20:00:02.332242 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" event={"ID":"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c","Type":"ContainerDied","Data":"5ccf237f66eb18a6d81651e774b4abc173c2f7cefc2728ef2947d6a60d97b7ca"} Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.026131 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dtjsm" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.026474 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dtjsm" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.072102 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-dtjsm" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.244358 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bph4q" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.244419 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bph4q" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.287265 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bph4q" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.422584 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qvm7z" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.422617 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qvm7z" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.485523 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qvm7z" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.601653 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.612425 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-54h5m" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.612482 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-54h5m" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.667507 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-54h5m" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.670375 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsjxh\" (UniqueName: \"kubernetes.io/projected/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-kube-api-access-tsjxh\") pod \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.670524 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-secret-volume\") pod \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.670615 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-config-volume\") pod \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\" (UID: \"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c\") " Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.671257 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c" (UID: "7c376edc-1f4d-4651-ad0c-cdb7b1412a6c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.677789 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7c376edc-1f4d-4651-ad0c-cdb7b1412a6c" (UID: "7c376edc-1f4d-4651-ad0c-cdb7b1412a6c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.683809 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-kube-api-access-tsjxh" (OuterVolumeSpecName: "kube-api-access-tsjxh") pod "7c376edc-1f4d-4651-ad0c-cdb7b1412a6c" (UID: "7c376edc-1f4d-4651-ad0c-cdb7b1412a6c"). InnerVolumeSpecName "kube-api-access-tsjxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.772556 4802 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.772598 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsjxh\" (UniqueName: \"kubernetes.io/projected/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-kube-api-access-tsjxh\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:03 crc kubenswrapper[4802]: I1201 20:00:03.772612 4802 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:04 crc kubenswrapper[4802]: I1201 20:00:04.344772 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" event={"ID":"7c376edc-1f4d-4651-ad0c-cdb7b1412a6c","Type":"ContainerDied","Data":"e8712e27b55ad3a416e36e4a211a9599a2aa383144b918859db88ba5147505a5"} Dec 01 20:00:04 crc kubenswrapper[4802]: I1201 20:00:04.344845 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8712e27b55ad3a416e36e4a211a9599a2aa383144b918859db88ba5147505a5" Dec 01 20:00:04 crc kubenswrapper[4802]: I1201 20:00:04.344916 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb" Dec 01 20:00:05 crc kubenswrapper[4802]: I1201 20:00:05.226987 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 20:00:05 crc kubenswrapper[4802]: I1201 20:00:05.227397 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 20:00:05 crc kubenswrapper[4802]: I1201 20:00:05.280638 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 20:00:06 crc kubenswrapper[4802]: I1201 20:00:06.224583 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 20:00:06 crc kubenswrapper[4802]: I1201 20:00:06.224643 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 20:00:06 crc kubenswrapper[4802]: I1201 20:00:06.359532 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkqxz" event={"ID":"f4850b32-033b-451d-af1d-fad64baead63","Type":"ContainerStarted","Data":"0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d"} Dec 01 20:00:06 crc kubenswrapper[4802]: I1201 20:00:06.378251 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jkqxz" podStartSLOduration=3.608273771 podStartE2EDuration="1m1.378227839s" podCreationTimestamp="2025-12-01 19:59:05 +0000 UTC" firstStartedPulling="2025-12-01 19:59:06.85427135 +0000 UTC m=+168.416830991" lastFinishedPulling="2025-12-01 20:00:04.624225408 +0000 UTC m=+226.186785059" observedRunningTime="2025-12-01 20:00:06.375779563 +0000 UTC m=+227.938339224" watchObservedRunningTime="2025-12-01 20:00:06.378227839 +0000 UTC m=+227.940787480" Dec 
01 20:00:07 crc kubenswrapper[4802]: I1201 20:00:07.265968 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c6vvc" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerName="registry-server" probeResult="failure" output=< Dec 01 20:00:07 crc kubenswrapper[4802]: timeout: failed to connect service ":50051" within 1s Dec 01 20:00:07 crc kubenswrapper[4802]: > Dec 01 20:00:13 crc kubenswrapper[4802]: I1201 20:00:13.064578 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dtjsm" Dec 01 20:00:13 crc kubenswrapper[4802]: I1201 20:00:13.280999 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bph4q" Dec 01 20:00:13 crc kubenswrapper[4802]: I1201 20:00:13.466433 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qvm7z" Dec 01 20:00:13 crc kubenswrapper[4802]: I1201 20:00:13.646578 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-54h5m" Dec 01 20:00:14 crc kubenswrapper[4802]: I1201 20:00:14.897435 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qvm7z"] Dec 01 20:00:14 crc kubenswrapper[4802]: I1201 20:00:14.897869 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qvm7z" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerName="registry-server" containerID="cri-o://81563d6cdb065ba58b870d6450951c9ea69a23e64e7e3017824d8dc41342a8a9" gracePeriod=2 Dec 01 20:00:15 crc kubenswrapper[4802]: I1201 20:00:15.278576 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 20:00:15 crc kubenswrapper[4802]: I1201 20:00:15.412741 4802 generic.go:334] 
"Generic (PLEG): container finished" podID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerID="81563d6cdb065ba58b870d6450951c9ea69a23e64e7e3017824d8dc41342a8a9" exitCode=0 Dec 01 20:00:15 crc kubenswrapper[4802]: I1201 20:00:15.412788 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvm7z" event={"ID":"225bc123-af5b-421f-b8e4-a3a13ce3ee58","Type":"ContainerDied","Data":"81563d6cdb065ba58b870d6450951c9ea69a23e64e7e3017824d8dc41342a8a9"} Dec 01 20:00:15 crc kubenswrapper[4802]: I1201 20:00:15.618419 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 20:00:15 crc kubenswrapper[4802]: I1201 20:00:15.618460 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 20:00:15 crc kubenswrapper[4802]: I1201 20:00:15.654638 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 20:00:15 crc kubenswrapper[4802]: I1201 20:00:15.893458 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54h5m"] Dec 01 20:00:15 crc kubenswrapper[4802]: I1201 20:00:15.893782 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-54h5m" podUID="edeba60a-7f88-48b8-a016-324a7527a666" containerName="registry-server" containerID="cri-o://6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11" gracePeriod=2 Dec 01 20:00:16 crc kubenswrapper[4802]: I1201 20:00:16.294589 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 20:00:16 crc kubenswrapper[4802]: I1201 20:00:16.340988 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 20:00:16 crc 
kubenswrapper[4802]: I1201 20:00:16.466973 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.065895 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qvm7z" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.152367 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-catalog-content\") pod \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.152515 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh59m\" (UniqueName: \"kubernetes.io/projected/225bc123-af5b-421f-b8e4-a3a13ce3ee58-kube-api-access-gh59m\") pod \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.152622 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-utilities\") pod \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\" (UID: \"225bc123-af5b-421f-b8e4-a3a13ce3ee58\") " Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.153515 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-utilities" (OuterVolumeSpecName: "utilities") pod "225bc123-af5b-421f-b8e4-a3a13ce3ee58" (UID: "225bc123-af5b-421f-b8e4-a3a13ce3ee58"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.162876 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/225bc123-af5b-421f-b8e4-a3a13ce3ee58-kube-api-access-gh59m" (OuterVolumeSpecName: "kube-api-access-gh59m") pod "225bc123-af5b-421f-b8e4-a3a13ce3ee58" (UID: "225bc123-af5b-421f-b8e4-a3a13ce3ee58"). InnerVolumeSpecName "kube-api-access-gh59m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.196969 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "225bc123-af5b-421f-b8e4-a3a13ce3ee58" (UID: "225bc123-af5b-421f-b8e4-a3a13ce3ee58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.254455 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh59m\" (UniqueName: \"kubernetes.io/projected/225bc123-af5b-421f-b8e4-a3a13ce3ee58-kube-api-access-gh59m\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.254490 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.254500 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/225bc123-af5b-421f-b8e4-a3a13ce3ee58-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.424337 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvm7z" 
event={"ID":"225bc123-af5b-421f-b8e4-a3a13ce3ee58","Type":"ContainerDied","Data":"037873669d53b3167762ad610934e99145230631e8ecc9217d070908c767ddf1"} Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.424420 4802 scope.go:117] "RemoveContainer" containerID="81563d6cdb065ba58b870d6450951c9ea69a23e64e7e3017824d8dc41342a8a9" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.424362 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qvm7z" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.439329 4802 scope.go:117] "RemoveContainer" containerID="90cb9bc3039e0bcdc2af61aef1c2c68a587ebc280471bf5baa80ad774774de95" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.457167 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qvm7z"] Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.457506 4802 scope.go:117] "RemoveContainer" containerID="67c29fdccf17355015b44bd47bf8e4a5595a74382866ce1f2ec4003afb6f14b2" Dec 01 20:00:17 crc kubenswrapper[4802]: I1201 20:00:17.462410 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qvm7z"] Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.224725 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-54h5m" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.269073 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-utilities\") pod \"edeba60a-7f88-48b8-a016-324a7527a666\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.269214 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-catalog-content\") pod \"edeba60a-7f88-48b8-a016-324a7527a666\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.269371 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n56l\" (UniqueName: \"kubernetes.io/projected/edeba60a-7f88-48b8-a016-324a7527a666-kube-api-access-9n56l\") pod \"edeba60a-7f88-48b8-a016-324a7527a666\" (UID: \"edeba60a-7f88-48b8-a016-324a7527a666\") " Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.270451 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-utilities" (OuterVolumeSpecName: "utilities") pod "edeba60a-7f88-48b8-a016-324a7527a666" (UID: "edeba60a-7f88-48b8-a016-324a7527a666"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.272850 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edeba60a-7f88-48b8-a016-324a7527a666-kube-api-access-9n56l" (OuterVolumeSpecName: "kube-api-access-9n56l") pod "edeba60a-7f88-48b8-a016-324a7527a666" (UID: "edeba60a-7f88-48b8-a016-324a7527a666"). InnerVolumeSpecName "kube-api-access-9n56l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.295443 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkqxz"] Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.322186 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edeba60a-7f88-48b8-a016-324a7527a666" (UID: "edeba60a-7f88-48b8-a016-324a7527a666"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.372308 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n56l\" (UniqueName: \"kubernetes.io/projected/edeba60a-7f88-48b8-a016-324a7527a666-kube-api-access-9n56l\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.372417 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.372438 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edeba60a-7f88-48b8-a016-324a7527a666-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.433383 4802 generic.go:334] "Generic (PLEG): container finished" podID="edeba60a-7f88-48b8-a016-324a7527a666" containerID="6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11" exitCode=0 Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.433437 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h5m" 
event={"ID":"edeba60a-7f88-48b8-a016-324a7527a666","Type":"ContainerDied","Data":"6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11"} Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.433504 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54h5m" event={"ID":"edeba60a-7f88-48b8-a016-324a7527a666","Type":"ContainerDied","Data":"114cf08cddfd3ba08b7f0c3f3819be251a199fb2cbf5cfd9956e6ff7811da883"} Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.433532 4802 scope.go:117] "RemoveContainer" containerID="6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.433542 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54h5m" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.433652 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jkqxz" podUID="f4850b32-033b-451d-af1d-fad64baead63" containerName="registry-server" containerID="cri-o://0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d" gracePeriod=2 Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.451658 4802 scope.go:117] "RemoveContainer" containerID="d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.472356 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54h5m"] Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.477657 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-54h5m"] Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.486616 4802 scope.go:117] "RemoveContainer" containerID="93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.537256 4802 
scope.go:117] "RemoveContainer" containerID="6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11" Dec 01 20:00:18 crc kubenswrapper[4802]: E1201 20:00:18.537856 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11\": container with ID starting with 6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11 not found: ID does not exist" containerID="6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.537985 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11"} err="failed to get container status \"6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11\": rpc error: code = NotFound desc = could not find container \"6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11\": container with ID starting with 6720324e6ea1ae4a86167f80a8e212be83e537880e9604d9d5964e1e1616bf11 not found: ID does not exist" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.538076 4802 scope.go:117] "RemoveContainer" containerID="d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77" Dec 01 20:00:18 crc kubenswrapper[4802]: E1201 20:00:18.538548 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77\": container with ID starting with d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77 not found: ID does not exist" containerID="d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.538650 4802 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77"} err="failed to get container status \"d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77\": rpc error: code = NotFound desc = could not find container \"d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77\": container with ID starting with d8ba24a487659272db48d50f59d5c0e8bc9e92a8e69a1a079f417fb2ee24ec77 not found: ID does not exist" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.538716 4802 scope.go:117] "RemoveContainer" containerID="93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1" Dec 01 20:00:18 crc kubenswrapper[4802]: E1201 20:00:18.539070 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1\": container with ID starting with 93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1 not found: ID does not exist" containerID="93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.539112 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1"} err="failed to get container status \"93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1\": rpc error: code = NotFound desc = could not find container \"93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1\": container with ID starting with 93b6c4b5aac91ea67cd4b1d3e3c5dd4b111e8dc0fc60ec0d4a583bd315a9b2b1 not found: ID does not exist" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.726091 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" path="/var/lib/kubelet/pods/225bc123-af5b-421f-b8e4-a3a13ce3ee58/volumes" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 
20:00:18.727074 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edeba60a-7f88-48b8-a016-324a7527a666" path="/var/lib/kubelet/pods/edeba60a-7f88-48b8-a016-324a7527a666/volumes" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.763650 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.878343 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-utilities\") pod \"f4850b32-033b-451d-af1d-fad64baead63\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.878473 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z65l\" (UniqueName: \"kubernetes.io/projected/f4850b32-033b-451d-af1d-fad64baead63-kube-api-access-9z65l\") pod \"f4850b32-033b-451d-af1d-fad64baead63\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.878512 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-catalog-content\") pod \"f4850b32-033b-451d-af1d-fad64baead63\" (UID: \"f4850b32-033b-451d-af1d-fad64baead63\") " Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.879487 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-utilities" (OuterVolumeSpecName: "utilities") pod "f4850b32-033b-451d-af1d-fad64baead63" (UID: "f4850b32-033b-451d-af1d-fad64baead63"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.884466 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4850b32-033b-451d-af1d-fad64baead63-kube-api-access-9z65l" (OuterVolumeSpecName: "kube-api-access-9z65l") pod "f4850b32-033b-451d-af1d-fad64baead63" (UID: "f4850b32-033b-451d-af1d-fad64baead63"). InnerVolumeSpecName "kube-api-access-9z65l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.906322 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4850b32-033b-451d-af1d-fad64baead63" (UID: "f4850b32-033b-451d-af1d-fad64baead63"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.979880 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.979916 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z65l\" (UniqueName: \"kubernetes.io/projected/f4850b32-033b-451d-af1d-fad64baead63-kube-api-access-9z65l\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:18 crc kubenswrapper[4802]: I1201 20:00:18.979931 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4850b32-033b-451d-af1d-fad64baead63-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.441319 4802 generic.go:334] "Generic (PLEG): container finished" podID="f4850b32-033b-451d-af1d-fad64baead63" 
containerID="0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d" exitCode=0 Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.441362 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkqxz" event={"ID":"f4850b32-033b-451d-af1d-fad64baead63","Type":"ContainerDied","Data":"0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d"} Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.441391 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jkqxz" event={"ID":"f4850b32-033b-451d-af1d-fad64baead63","Type":"ContainerDied","Data":"fdad27d6cfbc5899c9045dcb2de2da6544d188c7233c6dbdcf117c629ab92a7f"} Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.441413 4802 scope.go:117] "RemoveContainer" containerID="0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.442187 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jkqxz" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.464084 4802 scope.go:117] "RemoveContainer" containerID="fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.483480 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkqxz"] Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.491378 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jkqxz"] Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.503039 4802 scope.go:117] "RemoveContainer" containerID="de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.518674 4802 scope.go:117] "RemoveContainer" containerID="0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d" Dec 01 20:00:19 crc kubenswrapper[4802]: E1201 20:00:19.519117 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d\": container with ID starting with 0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d not found: ID does not exist" containerID="0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.519235 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d"} err="failed to get container status \"0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d\": rpc error: code = NotFound desc = could not find container \"0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d\": container with ID starting with 0f6300406917753097bd2fa047fd223becc89f4d3570036f7860aa9a73bc023d not found: 
ID does not exist" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.519329 4802 scope.go:117] "RemoveContainer" containerID="fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046" Dec 01 20:00:19 crc kubenswrapper[4802]: E1201 20:00:19.519871 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046\": container with ID starting with fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046 not found: ID does not exist" containerID="fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.519937 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046"} err="failed to get container status \"fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046\": rpc error: code = NotFound desc = could not find container \"fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046\": container with ID starting with fd3b7a9e8c6376295ceecb346b0e0989878127ad9b8003efec71409f710dd046 not found: ID does not exist" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.519983 4802 scope.go:117] "RemoveContainer" containerID="de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d" Dec 01 20:00:19 crc kubenswrapper[4802]: E1201 20:00:19.520437 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d\": container with ID starting with de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d not found: ID does not exist" containerID="de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d" Dec 01 20:00:19 crc kubenswrapper[4802]: I1201 20:00:19.520503 4802 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d"} err="failed to get container status \"de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d\": rpc error: code = NotFound desc = could not find container \"de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d\": container with ID starting with de48f2e40f103e27b77d9feb8f8b29a5485babcb008e1249f5e4093ee4749e6d not found: ID does not exist" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.495021 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" podUID="fcf38656-7a15-4cd1-9038-83272327ce3c" containerName="oauth-openshift" containerID="cri-o://009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72" gracePeriod=15 Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.728006 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4850b32-033b-451d-af1d-fad64baead63" path="/var/lib/kubelet/pods/f4850b32-033b-451d-af1d-fad64baead63/volumes" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.822563 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.906830 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-login\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.906893 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-ocp-branding-template\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.906919 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zllm9\" (UniqueName: \"kubernetes.io/projected/fcf38656-7a15-4cd1-9038-83272327ce3c-kube-api-access-zllm9\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.906969 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-error\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.906997 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-session\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: 
\"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907020 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-dir\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907043 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-cliconfig\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907068 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-service-ca\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907103 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-policies\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907096 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907124 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-serving-cert\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907163 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-router-certs\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907211 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-idp-0-file-data\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907235 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-trusted-ca-bundle\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: \"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907256 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-provider-selection\") pod \"fcf38656-7a15-4cd1-9038-83272327ce3c\" (UID: 
\"fcf38656-7a15-4cd1-9038-83272327ce3c\") " Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907482 4802 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907799 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.907807 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.908130 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.908249 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.912638 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf38656-7a15-4cd1-9038-83272327ce3c-kube-api-access-zllm9" (OuterVolumeSpecName: "kube-api-access-zllm9") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "kube-api-access-zllm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.912680 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.912893 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.913068 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.913476 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.913603 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.913613 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.913842 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:00:20 crc kubenswrapper[4802]: I1201 20:00:20.915006 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fcf38656-7a15-4cd1-9038-83272327ce3c" (UID: "fcf38656-7a15-4cd1-9038-83272327ce3c"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008673 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008719 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008730 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zllm9\" (UniqueName: \"kubernetes.io/projected/fcf38656-7a15-4cd1-9038-83272327ce3c-kube-api-access-zllm9\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008741 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008754 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008763 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008774 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008784 4802 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008792 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008802 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008811 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008820 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.008830 4802 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fcf38656-7a15-4cd1-9038-83272327ce3c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 
20:00:21.454740 4802 generic.go:334] "Generic (PLEG): container finished" podID="fcf38656-7a15-4cd1-9038-83272327ce3c" containerID="009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72" exitCode=0 Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.454812 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.454807 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" event={"ID":"fcf38656-7a15-4cd1-9038-83272327ce3c","Type":"ContainerDied","Data":"009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72"} Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.454885 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nsqcw" event={"ID":"fcf38656-7a15-4cd1-9038-83272327ce3c","Type":"ContainerDied","Data":"33d7f8738f9f3ca5cd99f028801fa9a18310a1a2043316cf5360f850c2491ca9"} Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.454913 4802 scope.go:117] "RemoveContainer" containerID="009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.478344 4802 scope.go:117] "RemoveContainer" containerID="009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72" Dec 01 20:00:21 crc kubenswrapper[4802]: E1201 20:00:21.478844 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72\": container with ID starting with 009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72 not found: ID does not exist" containerID="009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.478887 4802 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72"} err="failed to get container status \"009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72\": rpc error: code = NotFound desc = could not find container \"009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72\": container with ID starting with 009a6bd806ef8a7c973e3ef5ba7b47df6869c43034f6de3d94b768caa6f7db72 not found: ID does not exist" Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.489446 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nsqcw"] Dec 01 20:00:21 crc kubenswrapper[4802]: I1201 20:00:21.496945 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nsqcw"] Dec 01 20:00:22 crc kubenswrapper[4802]: I1201 20:00:22.726496 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf38656-7a15-4cd1-9038-83272327ce3c" path="/var/lib/kubelet/pods/fcf38656-7a15-4cd1-9038-83272327ce3c/volumes" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.158571 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-546468998b-xrp58"] Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159080 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4850b32-033b-451d-af1d-fad64baead63" containerName="registry-server" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159095 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4850b32-033b-451d-af1d-fad64baead63" containerName="registry-server" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159109 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edeba60a-7f88-48b8-a016-324a7527a666" containerName="registry-server" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159121 4802 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="edeba60a-7f88-48b8-a016-324a7527a666" containerName="registry-server" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159135 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edeba60a-7f88-48b8-a016-324a7527a666" containerName="extract-content" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159144 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="edeba60a-7f88-48b8-a016-324a7527a666" containerName="extract-content" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159155 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerName="extract-content" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159163 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerName="extract-content" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159174 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerName="registry-server" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159183 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerName="registry-server" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159192 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerName="extract-utilities" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159221 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerName="extract-utilities" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159232 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4850b32-033b-451d-af1d-fad64baead63" containerName="extract-utilities" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159242 4802 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4850b32-033b-451d-af1d-fad64baead63" containerName="extract-utilities" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159253 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4850b32-033b-451d-af1d-fad64baead63" containerName="extract-content" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159261 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4850b32-033b-451d-af1d-fad64baead63" containerName="extract-content" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159271 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf38656-7a15-4cd1-9038-83272327ce3c" containerName="oauth-openshift" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159279 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf38656-7a15-4cd1-9038-83272327ce3c" containerName="oauth-openshift" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159294 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c376edc-1f4d-4651-ad0c-cdb7b1412a6c" containerName="collect-profiles" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159301 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c376edc-1f4d-4651-ad0c-cdb7b1412a6c" containerName="collect-profiles" Dec 01 20:00:25 crc kubenswrapper[4802]: E1201 20:00:25.159313 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edeba60a-7f88-48b8-a016-324a7527a666" containerName="extract-utilities" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159322 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="edeba60a-7f88-48b8-a016-324a7527a666" containerName="extract-utilities" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159464 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf38656-7a15-4cd1-9038-83272327ce3c" containerName="oauth-openshift" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159480 4802 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="edeba60a-7f88-48b8-a016-324a7527a666" containerName="registry-server" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159492 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4850b32-033b-451d-af1d-fad64baead63" containerName="registry-server" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159508 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c376edc-1f4d-4651-ad0c-cdb7b1412a6c" containerName="collect-profiles" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159518 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="225bc123-af5b-421f-b8e4-a3a13ce3ee58" containerName="registry-server" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.159960 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.165082 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.167019 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.167848 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.168074 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.168172 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.168152 4802 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.168281 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.168154 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.168447 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.168624 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.169314 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.170028 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.179568 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.185733 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-546468998b-xrp58"] Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.192633 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.194448 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 20:00:25 crc 
kubenswrapper[4802]: I1201 20:00:25.269541 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-audit-dir\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.269627 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-template-error\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.269737 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.269840 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-router-certs\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.269891 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-session\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.270010 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmg9\" (UniqueName: \"kubernetes.io/projected/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-kube-api-access-6bmg9\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.270071 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.270155 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.270245 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-546468998b-xrp58\" 
(UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.270291 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-service-ca\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.270324 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-template-login\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.270358 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.270383 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-audit-policies\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.270460 4802 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.371855 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.371919 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.371964 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372001 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-service-ca\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372033 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-template-login\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372058 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372081 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-audit-policies\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372101 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " 
pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372146 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-audit-dir\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372169 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-template-error\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372233 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372261 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-router-certs\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372285 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-session\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372477 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-audit-dir\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372829 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-service-ca\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.372931 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmg9\" (UniqueName: \"kubernetes.io/projected/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-kube-api-access-6bmg9\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.373170 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.373216 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.373418 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-audit-policies\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.377636 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.377723 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-template-error\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.377996 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-template-login\") pod \"oauth-openshift-546468998b-xrp58\" (UID: 
\"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.378325 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.378384 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-router-certs\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.378610 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.379743 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.380059 4802 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-v4-0-config-system-session\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.388359 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmg9\" (UniqueName: \"kubernetes.io/projected/748ad5dd-f3c4-458b-bdc1-b2066a1dfad6-kube-api-access-6bmg9\") pod \"oauth-openshift-546468998b-xrp58\" (UID: \"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6\") " pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.487721 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:25 crc kubenswrapper[4802]: I1201 20:00:25.882352 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-546468998b-xrp58"] Dec 01 20:00:26 crc kubenswrapper[4802]: I1201 20:00:26.482943 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-546468998b-xrp58" event={"ID":"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6","Type":"ContainerStarted","Data":"0611c09472e9df79201e9f3714052bb059d2bad5614d302dfdcebd6c7068ef79"} Dec 01 20:00:27 crc kubenswrapper[4802]: I1201 20:00:27.490036 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-546468998b-xrp58" event={"ID":"748ad5dd-f3c4-458b-bdc1-b2066a1dfad6","Type":"ContainerStarted","Data":"01ef9133afddf7ce97627e5b244c884aa99b18f961872e8c2262e54ea93bb170"} Dec 01 20:00:27 crc kubenswrapper[4802]: I1201 20:00:27.490383 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:27 crc 
kubenswrapper[4802]: I1201 20:00:27.499044 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-546468998b-xrp58" Dec 01 20:00:27 crc kubenswrapper[4802]: I1201 20:00:27.523145 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-546468998b-xrp58" podStartSLOduration=32.523117722 podStartE2EDuration="32.523117722s" podCreationTimestamp="2025-12-01 19:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:00:27.517082725 +0000 UTC m=+249.079642406" watchObservedRunningTime="2025-12-01 20:00:27.523117722 +0000 UTC m=+249.085677403" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.853828 4802 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.854925 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.855305 4802 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.855838 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9" gracePeriod=15 Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.855880 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2" gracePeriod=15 Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.856015 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d" gracePeriod=15 Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.855998 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd" gracePeriod=15 Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.855998 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c" gracePeriod=15 Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.856756 4802 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 20:00:28 crc kubenswrapper[4802]: E1201 20:00:28.857006 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857022 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 20:00:28 crc kubenswrapper[4802]: E1201 20:00:28.857036 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857047 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 20:00:28 crc kubenswrapper[4802]: E1201 20:00:28.857061 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857070 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 20:00:28 crc kubenswrapper[4802]: E1201 20:00:28.857080 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857089 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Dec 01 20:00:28 crc kubenswrapper[4802]: E1201 20:00:28.857104 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857112 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 20:00:28 crc kubenswrapper[4802]: E1201 20:00:28.857122 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857129 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 20:00:28 crc kubenswrapper[4802]: E1201 20:00:28.857139 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857146 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857287 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857299 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857311 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857321 4802 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857329 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.857344 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.906727 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.931912 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.931999 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.932176 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:28 crc kubenswrapper[4802]: 
I1201 20:00:28.932364 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.932405 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.932475 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.932508 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:28 crc kubenswrapper[4802]: I1201 20:00:28.932582 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 
20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033406 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033448 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033477 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033526 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033560 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033559 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033609 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033579 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033640 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033628 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033670 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033719 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033851 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033879 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.033907 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.034014 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.204886 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 20:00:29 crc kubenswrapper[4802]: E1201 20:00:29.224913 4802 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d2fd6f1b957c0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 20:00:29.22406496 +0000 UTC m=+250.786624651,LastTimestamp:2025-12-01 20:00:29.22406496 +0000 UTC m=+250.786624651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.504138 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9"} Dec 01 20:00:29 crc 
kubenswrapper[4802]: I1201 20:00:29.504455 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"db529691bf6c8265fc6f7fb80c0214a4a7614ef89722c799860b2fb452de5148"} Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.505031 4802 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.505574 4802 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.507899 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.509484 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.510220 4802 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2" exitCode=0 Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.510244 4802 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd" exitCode=0 Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.510253 4802 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d" exitCode=0 Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.510259 4802 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c" exitCode=2 Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.510317 4802 scope.go:117] "RemoveContainer" containerID="2d3f0ab661b2b309dee3efac3ead39ce6990fa507f3ff98793ba85014615e76a" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.512126 4802 generic.go:334] "Generic (PLEG): container finished" podID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" containerID="e28f48f879146297575f78dbbce66e47a704c57059f8a4d43ae8cbd659a7f27c" exitCode=0 Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.512182 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b5cbd352-e2b4-4650-8265-e7b26b8890b4","Type":"ContainerDied","Data":"e28f48f879146297575f78dbbce66e47a704c57059f8a4d43ae8cbd659a7f27c"} Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.513728 4802 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.514312 4802 status_manager.go:851] "Failed to get status for pod" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:29 crc kubenswrapper[4802]: I1201 20:00:29.514745 4802 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.520583 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 20:00:30 crc kubenswrapper[4802]: E1201 20:00:30.671046 4802 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d2fd6f1b957c0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 20:00:29.22406496 +0000 UTC m=+250.786624651,LastTimestamp:2025-12-01 20:00:29.22406496 +0000 UTC m=+250.786624651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.761084 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.761907 4802 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.762517 4802 status_manager.go:851] "Failed to get status for pod" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.860068 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kube-api-access\") pod \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.860255 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kubelet-dir\") pod \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.860378 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b5cbd352-e2b4-4650-8265-e7b26b8890b4" (UID: "b5cbd352-e2b4-4650-8265-e7b26b8890b4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.860435 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-var-lock\") pod \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\" (UID: \"b5cbd352-e2b4-4650-8265-e7b26b8890b4\") " Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.860535 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-var-lock" (OuterVolumeSpecName: "var-lock") pod "b5cbd352-e2b4-4650-8265-e7b26b8890b4" (UID: "b5cbd352-e2b4-4650-8265-e7b26b8890b4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.860895 4802 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.860918 4802 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b5cbd352-e2b4-4650-8265-e7b26b8890b4-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.866226 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b5cbd352-e2b4-4650-8265-e7b26b8890b4" (UID: "b5cbd352-e2b4-4650-8265-e7b26b8890b4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:00:30 crc kubenswrapper[4802]: I1201 20:00:30.962846 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5cbd352-e2b4-4650-8265-e7b26b8890b4-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.253740 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.255480 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.256701 4802 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.257429 4802 status_manager.go:851] "Failed to get status for pod" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.257856 4802 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 
20:00:31.369064 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.369141 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.369242 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.369473 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.369584 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.369525 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.471880 4802 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.471963 4802 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.471984 4802 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.534783 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.536164 4802 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9" exitCode=0 Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.536373 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.536377 4802 scope.go:117] "RemoveContainer" containerID="6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.539629 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b5cbd352-e2b4-4650-8265-e7b26b8890b4","Type":"ContainerDied","Data":"cb9dcd1e02d623a2c1707b680e1c8c7c850bef81b22789b1daa3dc2fe56717d4"} Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.539682 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb9dcd1e02d623a2c1707b680e1c8c7c850bef81b22789b1daa3dc2fe56717d4" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.539713 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.554963 4802 scope.go:117] "RemoveContainer" containerID="4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.560763 4802 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.561277 4802 status_manager.go:851] "Failed to get status for pod" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:31 crc kubenswrapper[4802]: 
I1201 20:00:31.562072 4802 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.562491 4802 status_manager.go:851] "Failed to get status for pod" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.563081 4802 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.563957 4802 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.569722 4802 scope.go:117] "RemoveContainer" containerID="b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.589648 4802 scope.go:117] "RemoveContainer" containerID="d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.606522 4802 scope.go:117] "RemoveContainer" 
containerID="9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.623377 4802 scope.go:117] "RemoveContainer" containerID="a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.644662 4802 scope.go:117] "RemoveContainer" containerID="6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2" Dec 01 20:00:31 crc kubenswrapper[4802]: E1201 20:00:31.645259 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\": container with ID starting with 6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2 not found: ID does not exist" containerID="6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.645325 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2"} err="failed to get container status \"6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\": rpc error: code = NotFound desc = could not find container \"6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2\": container with ID starting with 6893add3fc3287f5a7f86b5bff54c6a843596180fc08a90d30a2c38dd21d3bd2 not found: ID does not exist" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.645395 4802 scope.go:117] "RemoveContainer" containerID="4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd" Dec 01 20:00:31 crc kubenswrapper[4802]: E1201 20:00:31.645696 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\": container with ID starting with 
4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd not found: ID does not exist" containerID="4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.645748 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd"} err="failed to get container status \"4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\": rpc error: code = NotFound desc = could not find container \"4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd\": container with ID starting with 4ce724796298ea5c8758130f688d83171fa1d3c9651a8d9ad970be8d844c43bd not found: ID does not exist" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.645767 4802 scope.go:117] "RemoveContainer" containerID="b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d" Dec 01 20:00:31 crc kubenswrapper[4802]: E1201 20:00:31.646048 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\": container with ID starting with b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d not found: ID does not exist" containerID="b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.646078 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d"} err="failed to get container status \"b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\": rpc error: code = NotFound desc = could not find container \"b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d\": container with ID starting with b3cc6acdf2e175d0b53ec0541b8b48457ba771da77868366a691f7c22643bc6d not found: ID does not 
exist" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.646102 4802 scope.go:117] "RemoveContainer" containerID="d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c" Dec 01 20:00:31 crc kubenswrapper[4802]: E1201 20:00:31.647345 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\": container with ID starting with d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c not found: ID does not exist" containerID="d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.647407 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c"} err="failed to get container status \"d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\": rpc error: code = NotFound desc = could not find container \"d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c\": container with ID starting with d0ca73dbf7d7e36f2a58614c6a7ebbae728511376a707259dbbe18ecc3f0bb5c not found: ID does not exist" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.647529 4802 scope.go:117] "RemoveContainer" containerID="9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9" Dec 01 20:00:31 crc kubenswrapper[4802]: E1201 20:00:31.647859 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\": container with ID starting with 9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9 not found: ID does not exist" containerID="9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.647891 4802 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9"} err="failed to get container status \"9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\": rpc error: code = NotFound desc = could not find container \"9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9\": container with ID starting with 9f539f1afe74de0bc96cb4efd0818b388ba5a1d2d8d05c023bbc6d0103a3c2c9 not found: ID does not exist" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.647907 4802 scope.go:117] "RemoveContainer" containerID="a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7" Dec 01 20:00:31 crc kubenswrapper[4802]: E1201 20:00:31.648422 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\": container with ID starting with a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7 not found: ID does not exist" containerID="a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7" Dec 01 20:00:31 crc kubenswrapper[4802]: I1201 20:00:31.648510 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7"} err="failed to get container status \"a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\": rpc error: code = NotFound desc = could not find container \"a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7\": container with ID starting with a655980ca314e6ab3d6a082b94ceb21d4b223fc8b36b8b033865c855af9b63a7 not found: ID does not exist" Dec 01 20:00:32 crc kubenswrapper[4802]: I1201 20:00:32.730603 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 20:00:34 crc 
kubenswrapper[4802]: E1201 20:00:34.738309 4802 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:34 crc kubenswrapper[4802]: E1201 20:00:34.739379 4802 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:34 crc kubenswrapper[4802]: E1201 20:00:34.739962 4802 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:34 crc kubenswrapper[4802]: E1201 20:00:34.740326 4802 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:34 crc kubenswrapper[4802]: E1201 20:00:34.740646 4802 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:34 crc kubenswrapper[4802]: I1201 20:00:34.740681 4802 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 20:00:34 crc kubenswrapper[4802]: E1201 20:00:34.741018 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Dec 01 20:00:34 
crc kubenswrapper[4802]: E1201 20:00:34.942249 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Dec 01 20:00:35 crc kubenswrapper[4802]: E1201 20:00:35.343728 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Dec 01 20:00:36 crc kubenswrapper[4802]: E1201 20:00:36.091548 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T20:00:36Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T20:00:36Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T20:00:36Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T20:00:36Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:36 crc kubenswrapper[4802]: E1201 20:00:36.092261 4802 kubelet_node_status.go:585] 
"Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:36 crc kubenswrapper[4802]: E1201 20:00:36.092731 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:36 crc kubenswrapper[4802]: E1201 20:00:36.092957 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:36 crc kubenswrapper[4802]: E1201 20:00:36.093303 4802 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:36 crc kubenswrapper[4802]: E1201 20:00:36.093346 4802 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 20:00:36 crc kubenswrapper[4802]: E1201 20:00:36.144768 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Dec 01 20:00:37 crc kubenswrapper[4802]: E1201 20:00:37.745766 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Dec 01 20:00:38 crc kubenswrapper[4802]: I1201 20:00:38.724332 4802 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:38 crc kubenswrapper[4802]: I1201 20:00:38.724786 4802 status_manager.go:851] "Failed to get status for pod" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:39 crc kubenswrapper[4802]: I1201 20:00:39.719052 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:39 crc kubenswrapper[4802]: I1201 20:00:39.719892 4802 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:39 crc kubenswrapper[4802]: I1201 20:00:39.720571 4802 status_manager.go:851] "Failed to get status for pod" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:39 crc kubenswrapper[4802]: I1201 20:00:39.736820 4802 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:39 crc kubenswrapper[4802]: I1201 20:00:39.736877 4802 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:39 crc kubenswrapper[4802]: E1201 20:00:39.737590 4802 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:39 crc kubenswrapper[4802]: I1201 20:00:39.738551 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:40 crc kubenswrapper[4802]: I1201 20:00:40.609576 4802 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fb6fde0943ed712f7c915c7f9489331cdaf09988f3e30d43f74e99676054d58c" exitCode=0 Dec 01 20:00:40 crc kubenswrapper[4802]: I1201 20:00:40.609716 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fb6fde0943ed712f7c915c7f9489331cdaf09988f3e30d43f74e99676054d58c"} Dec 01 20:00:40 crc kubenswrapper[4802]: I1201 20:00:40.610019 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f5b2dd6ff17339a125a3bb2f988e12a3cb8411e3c2d7a1e33d3d436f5c1c2b73"} Dec 01 20:00:40 crc kubenswrapper[4802]: I1201 20:00:40.610491 4802 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:40 crc kubenswrapper[4802]: I1201 20:00:40.610513 4802 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:40 crc 
kubenswrapper[4802]: I1201 20:00:40.611154 4802 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:40 crc kubenswrapper[4802]: E1201 20:00:40.611159 4802 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:40 crc kubenswrapper[4802]: I1201 20:00:40.611878 4802 status_manager.go:851] "Failed to get status for pod" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 01 20:00:40 crc kubenswrapper[4802]: E1201 20:00:40.673009 4802 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d2fd6f1b957c0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 20:00:29.22406496 +0000 UTC m=+250.786624651,LastTimestamp:2025-12-01 20:00:29.22406496 +0000 UTC m=+250.786624651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 20:00:40 crc kubenswrapper[4802]: E1201 20:00:40.772413 4802 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" volumeName="registry-storage" Dec 01 20:00:40 crc kubenswrapper[4802]: E1201 20:00:40.947066 4802 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="6.4s" Dec 01 20:00:41 crc kubenswrapper[4802]: I1201 20:00:41.621243 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c59324a8fc5637e088c707e3db74f23abc3e032218a1d6fc5fc5e2b66cca1370"} Dec 01 20:00:41 crc kubenswrapper[4802]: I1201 20:00:41.621643 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"106957a69b52b4c8d3b23b8256ff06c4f26ef2f96ee6da4601be58abf8ff5d26"} Dec 01 20:00:41 crc kubenswrapper[4802]: I1201 20:00:41.621653 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"07c7d3fac34b63f6699a36bc92bdfb71e116d8bcfe6ed9e0b3e5347ec149d767"} Dec 01 20:00:41 crc kubenswrapper[4802]: I1201 20:00:41.621662 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"302201879b4904aead27cc2e2a5d02b9cc4305d7402ee7d8a4f5d6bff06400d2"} Dec 01 20:00:42 crc kubenswrapper[4802]: I1201 20:00:42.631458 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4235589861d6d43cc98dac6d5843a9aa405d9180293c30c9769ba7a00b4a36d5"} Dec 01 20:00:42 crc kubenswrapper[4802]: I1201 20:00:42.631623 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:42 crc kubenswrapper[4802]: I1201 20:00:42.631668 4802 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:42 crc kubenswrapper[4802]: I1201 20:00:42.631693 4802 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:42 crc kubenswrapper[4802]: I1201 20:00:42.635498 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 20:00:42 crc kubenswrapper[4802]: I1201 20:00:42.635588 4802 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8" exitCode=1 Dec 01 20:00:42 crc kubenswrapper[4802]: I1201 20:00:42.635628 4802 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8"} Dec 01 20:00:42 crc kubenswrapper[4802]: I1201 20:00:42.636407 4802 scope.go:117] "RemoveContainer" containerID="7f2b61d320f79e913d2e3a2770b99da61d8e164466d95c7bafa7c489aa6feec8" Dec 01 20:00:43 crc kubenswrapper[4802]: I1201 20:00:43.646218 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 20:00:43 crc kubenswrapper[4802]: I1201 20:00:43.646754 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e131b5ae090ac52fb8def42844c97e9bb8c83d6c8ad90e73af031a4d33a1fcca"} Dec 01 20:00:44 crc kubenswrapper[4802]: I1201 20:00:44.737331 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 20:00:44 crc kubenswrapper[4802]: I1201 20:00:44.737488 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 20:00:44 crc kubenswrapper[4802]: I1201 20:00:44.737909 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 20:00:44 crc kubenswrapper[4802]: I1201 20:00:44.739012 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:44 crc kubenswrapper[4802]: I1201 20:00:44.739066 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:44 crc kubenswrapper[4802]: I1201 20:00:44.747129 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.698878 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.699314 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.701448 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.703360 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.710729 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.718313 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.800431 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.800556 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.802455 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.812694 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.828344 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.828413 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.845085 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 20:00:46 crc kubenswrapper[4802]: I1201 20:00:46.856262 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.051331 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 20:00:47 crc kubenswrapper[4802]: W1201 20:00:47.170585 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-eee31b9a5c70b11b2ec2ca55fad40742025c5519d87e04c3d2265e981ee739c3 WatchSource:0}: Error finding container eee31b9a5c70b11b2ec2ca55fad40742025c5519d87e04c3d2265e981ee739c3: Status 404 returned error can't find the container with id eee31b9a5c70b11b2ec2ca55fad40742025c5519d87e04c3d2265e981ee739c3 Dec 01 20:00:47 crc kubenswrapper[4802]: W1201 20:00:47.295713 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-abf92c425ef98d89d66d671acac1dfd007613f70afbf3baf74dbf03d83e4c0a3 WatchSource:0}: Error finding container abf92c425ef98d89d66d671acac1dfd007613f70afbf3baf74dbf03d83e4c0a3: Status 404 returned error can't find the container with id abf92c425ef98d89d66d671acac1dfd007613f70afbf3baf74dbf03d83e4c0a3 Dec 01 20:00:47 crc kubenswrapper[4802]: W1201 20:00:47.466388 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-fe18875d23c36fb5b240e6843aab73c5bf0346e6bd45424d757f4bfb096a22b9 WatchSource:0}: Error finding container fe18875d23c36fb5b240e6843aab73c5bf0346e6bd45424d757f4bfb096a22b9: Status 404 returned error can't find the container with id fe18875d23c36fb5b240e6843aab73c5bf0346e6bd45424d757f4bfb096a22b9 Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.641495 4802 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.671358 4802 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d127cc6b9e7c9ae4cd15afe0f320ff748ea6376a90bf4d7eb6934e40a6777888"} Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.671419 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"eee31b9a5c70b11b2ec2ca55fad40742025c5519d87e04c3d2265e981ee739c3"} Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.672542 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2f0b4321b50b13c1c37d622ae404f61fb7bdb6555b6796d185f6f0269bbe6008"} Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.672577 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"abf92c425ef98d89d66d671acac1dfd007613f70afbf3baf74dbf03d83e4c0a3"} Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.673904 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1e57b332446d9c3119521427b152e865d241a499a21aaab6d885332fc2fb98d9"} Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.673956 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fe18875d23c36fb5b240e6843aab73c5bf0346e6bd45424d757f4bfb096a22b9"} Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.674297 4802 
kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.674325 4802 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:47 crc kubenswrapper[4802]: I1201 20:00:47.678348 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:00:48 crc kubenswrapper[4802]: I1201 20:00:48.681699 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Dec 01 20:00:48 crc kubenswrapper[4802]: I1201 20:00:48.681753 4802 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="2f0b4321b50b13c1c37d622ae404f61fb7bdb6555b6796d185f6f0269bbe6008" exitCode=255 Dec 01 20:00:48 crc kubenswrapper[4802]: I1201 20:00:48.681837 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"2f0b4321b50b13c1c37d622ae404f61fb7bdb6555b6796d185f6f0269bbe6008"} Dec 01 20:00:48 crc kubenswrapper[4802]: I1201 20:00:48.682065 4802 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:48 crc kubenswrapper[4802]: I1201 20:00:48.682078 4802 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fbe7dc36-ec25-4beb-8d0d-8e40b9f4c868" Dec 01 20:00:48 crc kubenswrapper[4802]: I1201 20:00:48.682461 4802 scope.go:117] "RemoveContainer" containerID="2f0b4321b50b13c1c37d622ae404f61fb7bdb6555b6796d185f6f0269bbe6008" Dec 01 20:00:48 crc 
kubenswrapper[4802]: I1201 20:00:48.745592 4802 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f2f5dd6c-5557-46dc-9a32-b17c9adb2f39" Dec 01 20:00:49 crc kubenswrapper[4802]: I1201 20:00:49.692028 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 01 20:00:49 crc kubenswrapper[4802]: I1201 20:00:49.693379 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Dec 01 20:00:49 crc kubenswrapper[4802]: I1201 20:00:49.693442 4802 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="745bd60b57f0e819a964b6ea4167cf7f7f1a1172fa913918481aeedd15d44a1b" exitCode=255 Dec 01 20:00:49 crc kubenswrapper[4802]: I1201 20:00:49.693497 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"745bd60b57f0e819a964b6ea4167cf7f7f1a1172fa913918481aeedd15d44a1b"} Dec 01 20:00:49 crc kubenswrapper[4802]: I1201 20:00:49.693592 4802 scope.go:117] "RemoveContainer" containerID="2f0b4321b50b13c1c37d622ae404f61fb7bdb6555b6796d185f6f0269bbe6008" Dec 01 20:00:49 crc kubenswrapper[4802]: I1201 20:00:49.695116 4802 scope.go:117] "RemoveContainer" containerID="745bd60b57f0e819a964b6ea4167cf7f7f1a1172fa913918481aeedd15d44a1b" Dec 01 20:00:49 crc kubenswrapper[4802]: E1201 20:00:49.695695 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints 
pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 20:00:50 crc kubenswrapper[4802]: I1201 20:00:50.703794 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 01 20:00:54 crc kubenswrapper[4802]: I1201 20:00:54.363261 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 20:00:54 crc kubenswrapper[4802]: I1201 20:00:54.424423 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 20:00:54 crc kubenswrapper[4802]: I1201 20:00:54.483738 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 20:00:54 crc kubenswrapper[4802]: I1201 20:00:54.495916 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 20:00:55 crc kubenswrapper[4802]: I1201 20:00:55.250442 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 20:00:55 crc kubenswrapper[4802]: I1201 20:00:55.385300 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 20:00:55 crc kubenswrapper[4802]: I1201 20:00:55.915341 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 20:00:56 crc kubenswrapper[4802]: I1201 20:00:56.783660 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 20:00:56 crc kubenswrapper[4802]: I1201 20:00:56.995270 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 20:00:57 crc kubenswrapper[4802]: I1201 20:00:57.019675 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 20:00:57 crc kubenswrapper[4802]: I1201 20:00:57.052307 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 20:00:57 crc kubenswrapper[4802]: I1201 20:00:57.162132 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 20:00:57 crc kubenswrapper[4802]: I1201 20:00:57.848959 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 20:00:58 crc kubenswrapper[4802]: I1201 20:00:58.243519 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 20:00:58 crc kubenswrapper[4802]: I1201 20:00:58.293780 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 20:00:59 crc kubenswrapper[4802]: I1201 20:00:59.241998 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 20:00:59 crc kubenswrapper[4802]: I1201 20:00:59.553414 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 20:01:00.061188 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 20:01:00.225940 4802 
reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 20:01:00.228112 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 20:01:00.232162 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=32.232137609 podStartE2EDuration="32.232137609s" podCreationTimestamp="2025-12-01 20:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:00:47.650431181 +0000 UTC m=+269.212990822" watchObservedRunningTime="2025-12-01 20:01:00.232137609 +0000 UTC m=+281.794697260" Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 20:01:00.233987 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 20:01:00.234040 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 20:01:00.238409 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 20:01:00.249324 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.249298081 podStartE2EDuration="13.249298081s" podCreationTimestamp="2025-12-01 20:00:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:01:00.249144406 +0000 UTC m=+281.811704047" watchObservedRunningTime="2025-12-01 20:01:00.249298081 +0000 UTC m=+281.811857722" Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 
20:01:00.669371 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 20:01:00 crc kubenswrapper[4802]: I1201 20:01:00.689286 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.047686 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.083183 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.109348 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.130298 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.156909 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.224439 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.225851 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.335075 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.392658 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.771642 4802 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.784236 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.792224 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 20:01:01 crc kubenswrapper[4802]: I1201 20:01:01.915178 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 20:01:02.036564 4802 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 20:01:02.134062 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 20:01:02.188749 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 20:01:02.354179 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 20:01:02.438045 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 20:01:02.477659 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 20:01:02.482231 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 
20:01:02.674037 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 20:01:02.719995 4802 scope.go:117] "RemoveContainer" containerID="745bd60b57f0e819a964b6ea4167cf7f7f1a1172fa913918481aeedd15d44a1b" Dec 01 20:01:02 crc kubenswrapper[4802]: I1201 20:01:02.794892 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.053797 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.415663 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.471089 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.634602 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.687836 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.690568 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.719869 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.795771 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.795842 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8b3e41bc3bf5cfcbb0305133d61609c2127c1a45d413719c9e980af39db8b81e"} Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.814895 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.925098 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 20:01:03 crc kubenswrapper[4802]: I1201 20:01:03.953440 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.046617 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.071951 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.091592 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.099804 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.176779 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 20:01:04 crc 
kubenswrapper[4802]: I1201 20:01:04.209165 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.347420 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.528304 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.601575 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.707685 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.744544 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.748584 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.753475 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.837972 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.845208 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.859654 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" 
Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.870728 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.931953 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 20:01:04 crc kubenswrapper[4802]: I1201 20:01:04.986727 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.064359 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.079653 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.116680 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.228658 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.249095 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.385651 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.423471 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.432946 4802 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.434552 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.651083 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.695154 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.711265 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.782917 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.801945 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.803287 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.871574 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.907827 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 20:01:05.912545 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 20:01:05 crc kubenswrapper[4802]: I1201 
20:01:05.994576 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.017152 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.121915 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.152677 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.226386 4802 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.227476 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.279925 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.304932 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.317774 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.421539 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.429113 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 
20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.442514 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.474158 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.658591 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.669954 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.793647 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 20:01:06 crc kubenswrapper[4802]: I1201 20:01:06.883869 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.011743 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.053269 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.053388 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.077451 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.105357 4802 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.238482 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.345856 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.387593 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.433177 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.493056 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.502370 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.513827 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.556097 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.590543 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.620717 4802 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.621963 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.631015 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.657094 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.704736 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.795950 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.816231 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.913749 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.984134 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 01 20:01:07 crc kubenswrapper[4802]: I1201 20:01:07.990632 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.018974 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.118239 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.231777 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.328302 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.357240 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.417834 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.473653 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.495556 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.545859 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.590297 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.827730 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.828325 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.845547 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.923742 4802 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 01 20:01:08 crc kubenswrapper[4802]: I1201 20:01:08.938850 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.039627 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.057789 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.142171 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.154549 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.211189 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.267762 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.300792 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.334708 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.349004 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.358983 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.400399 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.410277 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.511311 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.512601 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.515540 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.549806 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.609563 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.634993 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.651838 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.761070 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.795153 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 01 20:01:09 crc kubenswrapper[4802]: I1201 20:01:09.806702 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.013691 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.098999 4802 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.099432 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9" gracePeriod=5
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.113295 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.282365 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.298106 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.325519 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.326286 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.395994 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.444174 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.452222 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.552794 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.568619 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.694444 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.717734 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.742792 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.841305 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.969331 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 01 20:01:10 crc kubenswrapper[4802]: I1201 20:01:10.976551 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.082513 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.110276 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.158589 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.198437 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.271121 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.295217 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.373571 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.405043 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.453267 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.528877 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.653624 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.729517 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.800337 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.829911 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.840994 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.863640 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.879385 4802 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.883520 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 01 20:01:11 crc kubenswrapper[4802]: I1201 20:01:11.985837 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.040982 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.045776 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.072944 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.113540 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.149790 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.316206 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.328407 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.360516 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.435354 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.541694 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.792381 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.810249 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 01 20:01:12 crc kubenswrapper[4802]: I1201 20:01:12.933679 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 01 20:01:13 crc kubenswrapper[4802]: I1201 20:01:13.021322 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 01 20:01:13 crc kubenswrapper[4802]: I1201 20:01:13.129496 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 01 20:01:13 crc kubenswrapper[4802]: I1201 20:01:13.205140 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 01 20:01:13 crc kubenswrapper[4802]: I1201 20:01:13.411576 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 01 20:01:13 crc kubenswrapper[4802]: I1201 20:01:13.503121 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 01 20:01:13 crc kubenswrapper[4802]: I1201 20:01:13.526217 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 01 20:01:13 crc kubenswrapper[4802]: I1201 20:01:13.619434 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 01 20:01:13 crc kubenswrapper[4802]: I1201 20:01:13.669901 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 01 20:01:13 crc kubenswrapper[4802]: I1201 20:01:13.760765 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.127843 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.134578 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.236498 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.252593 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.276827 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.360005 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.401838 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.575010 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.597667 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.696702 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.855472 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 01 20:01:14 crc kubenswrapper[4802]: I1201 20:01:14.998687 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.448572 4802 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.629267 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.668350 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.668792 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.668906 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.674023 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.764576 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.764614 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.764677 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.764704 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.764774 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.764836 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.764845 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.764883 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.764972 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.765085 4802 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.765097 4802 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.765107 4802 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.765114 4802 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.773456 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.866152 4802 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.868455 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.868503 4802 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9" exitCode=137
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.868559 4802 scope.go:117] "RemoveContainer" containerID="ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.868606 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.882996 4802 scope.go:117] "RemoveContainer" containerID="ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9"
Dec 01 20:01:15 crc kubenswrapper[4802]: E1201 20:01:15.883454 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9\": container with ID starting with ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9 not found: ID does not exist" containerID="ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9"
Dec 01 20:01:15 crc kubenswrapper[4802]: I1201 20:01:15.883501 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9"} err="failed to get container status \"ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9\": rpc error: code = NotFound desc = could not find container \"ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9\": container with ID starting with ab074fe5550cfcf629503b4c2b29a2ac6d48f7b42a91498f2480f174a7844bf9 not found: ID does not exist"
Dec 01 20:01:16 crc kubenswrapper[4802]: I1201 20:01:16.001424 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 01 20:01:16 crc kubenswrapper[4802]: I1201 20:01:16.040328 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 20:01:16 crc kubenswrapper[4802]: I1201 20:01:16.579085 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 01 20:01:16 crc kubenswrapper[4802]: I1201 20:01:16.726708 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Dec 01 20:01:16 crc kubenswrapper[4802]: I1201 20:01:16.727154 4802 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Dec 01 20:01:16 crc kubenswrapper[4802]: I1201 20:01:16.737860 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 01 20:01:16 crc kubenswrapper[4802]: I1201 20:01:16.737908 4802 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c2c1325d-d0bd-4839-99cf-42b20e8a5d17"
Dec 01 20:01:16 crc kubenswrapper[4802]: I1201 20:01:16.743755 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 01 20:01:16 crc kubenswrapper[4802]: I1201 20:01:16.743794 4802 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c2c1325d-d0bd-4839-99cf-42b20e8a5d17"
Dec 01 20:01:17 crc kubenswrapper[4802]: I1201 20:01:17.026914 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 01 20:01:17 crc kubenswrapper[4802]: I1201 20:01:17.055335 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 20:01:17 crc kubenswrapper[4802]: I1201 20:01:17.081904 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 01 20:01:17 crc kubenswrapper[4802]: I1201 20:01:17.473861 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 01 20:01:18 crc kubenswrapper[4802]: I1201 20:01:18.408175 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.622163 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hv2h9"]
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.622892 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" podUID="6b139dad-bbb0-4d0f-bd11-14f142ef1767" containerName="controller-manager" containerID="cri-o://02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d" gracePeriod=30
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.727583 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm"]
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.728449 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" podUID="56c261dd-49c4-4f69-9400-7a012e281b7b" containerName="route-controller-manager" containerID="cri-o://99717dba24628eb04984c7e6568fed381d6a91f494eb675aaef265a7e6e83292" gracePeriod=30
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.959647 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.980919 4802 generic.go:334] "Generic (PLEG): container finished" podID="56c261dd-49c4-4f69-9400-7a012e281b7b" containerID="99717dba24628eb04984c7e6568fed381d6a91f494eb675aaef265a7e6e83292" exitCode=0
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.980976 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" event={"ID":"56c261dd-49c4-4f69-9400-7a012e281b7b","Type":"ContainerDied","Data":"99717dba24628eb04984c7e6568fed381d6a91f494eb675aaef265a7e6e83292"}
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.982411 4802 generic.go:334] "Generic (PLEG): container finished" podID="6b139dad-bbb0-4d0f-bd11-14f142ef1767" containerID="02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d" exitCode=0
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.982446 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" event={"ID":"6b139dad-bbb0-4d0f-bd11-14f142ef1767","Type":"ContainerDied","Data":"02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d"}
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.982465 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9" event={"ID":"6b139dad-bbb0-4d0f-bd11-14f142ef1767","Type":"ContainerDied","Data":"f1d4bb3293dee0da48beccae98e4ccad27878e2d9e5ca330618b95a1cab82856"}
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.982484 4802 scope.go:117] "RemoveContainer" containerID="02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d"
Dec 01 20:01:33 crc kubenswrapper[4802]: I1201 20:01:33.982614 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hv2h9"
Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.006786 4802 scope.go:117] "RemoveContainer" containerID="02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d"
Dec 01 20:01:34 crc kubenswrapper[4802]: E1201 20:01:34.007300 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d\": container with ID starting with 02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d not found: ID does not exist" containerID="02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d"
Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.007345 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d"} err="failed to get container status \"02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d\": rpc error: code = NotFound desc = could not find container \"02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d\": container with ID starting with 02518d33f19f286d1413802aa22f08506e917a85a83e136bfdf024985b79ae6d not found: ID does not exist"
Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.019107 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm"
Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.108795 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-client-ca\") pod \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") "
Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.109242 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b139dad-bbb0-4d0f-bd11-14f142ef1767-serving-cert\") pod \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") "
Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.109347 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-proxy-ca-bundles\") pod \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") "
Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.109400 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-config\") pod \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") "
Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.109421 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm622\" (UniqueName: \"kubernetes.io/projected/6b139dad-bbb0-4d0f-bd11-14f142ef1767-kube-api-access-gm622\") pod \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\" (UID: \"6b139dad-bbb0-4d0f-bd11-14f142ef1767\") "
Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.109534 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-client-ca" (OuterVolumeSpecName: "client-ca") pod "6b139dad-bbb0-4d0f-bd11-14f142ef1767" (UID: "6b139dad-bbb0-4d0f-bd11-14f142ef1767"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.110013 4802 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.110088 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6b139dad-bbb0-4d0f-bd11-14f142ef1767" (UID: "6b139dad-bbb0-4d0f-bd11-14f142ef1767"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.110371 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-config" (OuterVolumeSpecName: "config") pod "6b139dad-bbb0-4d0f-bd11-14f142ef1767" (UID: "6b139dad-bbb0-4d0f-bd11-14f142ef1767"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.115133 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b139dad-bbb0-4d0f-bd11-14f142ef1767-kube-api-access-gm622" (OuterVolumeSpecName: "kube-api-access-gm622") pod "6b139dad-bbb0-4d0f-bd11-14f142ef1767" (UID: "6b139dad-bbb0-4d0f-bd11-14f142ef1767"). InnerVolumeSpecName "kube-api-access-gm622". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.115436 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b139dad-bbb0-4d0f-bd11-14f142ef1767-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6b139dad-bbb0-4d0f-bd11-14f142ef1767" (UID: "6b139dad-bbb0-4d0f-bd11-14f142ef1767"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.211090 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56c261dd-49c4-4f69-9400-7a012e281b7b-serving-cert\") pod \"56c261dd-49c4-4f69-9400-7a012e281b7b\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.211162 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-client-ca\") pod \"56c261dd-49c4-4f69-9400-7a012e281b7b\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.211245 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-config\") pod \"56c261dd-49c4-4f69-9400-7a012e281b7b\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.211347 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctrj4\" (UniqueName: \"kubernetes.io/projected/56c261dd-49c4-4f69-9400-7a012e281b7b-kube-api-access-ctrj4\") pod \"56c261dd-49c4-4f69-9400-7a012e281b7b\" (UID: \"56c261dd-49c4-4f69-9400-7a012e281b7b\") " Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.211573 4802 reconciler_common.go:293] "Volume detached for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.211596 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b139dad-bbb0-4d0f-bd11-14f142ef1767-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.211608 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm622\" (UniqueName: \"kubernetes.io/projected/6b139dad-bbb0-4d0f-bd11-14f142ef1767-kube-api-access-gm622\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.211622 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b139dad-bbb0-4d0f-bd11-14f142ef1767-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.212502 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-client-ca" (OuterVolumeSpecName: "client-ca") pod "56c261dd-49c4-4f69-9400-7a012e281b7b" (UID: "56c261dd-49c4-4f69-9400-7a012e281b7b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.212560 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-config" (OuterVolumeSpecName: "config") pod "56c261dd-49c4-4f69-9400-7a012e281b7b" (UID: "56c261dd-49c4-4f69-9400-7a012e281b7b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.215391 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c261dd-49c4-4f69-9400-7a012e281b7b-kube-api-access-ctrj4" (OuterVolumeSpecName: "kube-api-access-ctrj4") pod "56c261dd-49c4-4f69-9400-7a012e281b7b" (UID: "56c261dd-49c4-4f69-9400-7a012e281b7b"). InnerVolumeSpecName "kube-api-access-ctrj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.215423 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c261dd-49c4-4f69-9400-7a012e281b7b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "56c261dd-49c4-4f69-9400-7a012e281b7b" (UID: "56c261dd-49c4-4f69-9400-7a012e281b7b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.309074 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hv2h9"] Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.312394 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hv2h9"] Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.312821 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.312884 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctrj4\" (UniqueName: \"kubernetes.io/projected/56c261dd-49c4-4f69-9400-7a012e281b7b-kube-api-access-ctrj4\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.312914 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/56c261dd-49c4-4f69-9400-7a012e281b7b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.312929 4802 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56c261dd-49c4-4f69-9400-7a012e281b7b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.728469 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b139dad-bbb0-4d0f-bd11-14f142ef1767" path="/var/lib/kubelet/pods/6b139dad-bbb0-4d0f-bd11-14f142ef1767/volumes" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.992375 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" event={"ID":"56c261dd-49c4-4f69-9400-7a012e281b7b","Type":"ContainerDied","Data":"db33ca0ca1be15ce759f7d676072688a219b463d93e03cc53afd0fd950f62d32"} Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.992808 4802 scope.go:117] "RemoveContainer" containerID="99717dba24628eb04984c7e6568fed381d6a91f494eb675aaef265a7e6e83292" Dec 01 20:01:34 crc kubenswrapper[4802]: I1201 20:01:34.992436 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.015231 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm"] Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.018595 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qr5sm"] Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.197656 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt"] Dec 01 20:01:35 crc kubenswrapper[4802]: E1201 20:01:35.197903 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.197916 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 20:01:35 crc kubenswrapper[4802]: E1201 20:01:35.197924 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b139dad-bbb0-4d0f-bd11-14f142ef1767" containerName="controller-manager" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.197932 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b139dad-bbb0-4d0f-bd11-14f142ef1767" containerName="controller-manager" Dec 01 20:01:35 crc kubenswrapper[4802]: E1201 20:01:35.197945 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c261dd-49c4-4f69-9400-7a012e281b7b" containerName="route-controller-manager" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.197952 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c261dd-49c4-4f69-9400-7a012e281b7b" containerName="route-controller-manager" Dec 01 20:01:35 crc kubenswrapper[4802]: E1201 20:01:35.197963 4802 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" containerName="installer" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.197968 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" containerName="installer" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.198051 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b139dad-bbb0-4d0f-bd11-14f142ef1767" containerName="controller-manager" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.198059 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5cbd352-e2b4-4650-8265-e7b26b8890b4" containerName="installer" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.198072 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c261dd-49c4-4f69-9400-7a012e281b7b" containerName="route-controller-manager" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.198082 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.198479 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.200774 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.200790 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6884cfc866-4rqsh"] Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.200991 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.201473 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.201834 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.201990 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.203425 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.203431 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.203836 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.203968 4802 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.204482 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.204622 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.204902 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.205572 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.218661 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.220155 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6884cfc866-4rqsh"] Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.257828 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt"] Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.328727 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1d353b-d53e-4876-81e3-f473e25be987-serving-cert\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.328780 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-proxy-ca-bundles\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.328823 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-client-ca\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.328845 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrg5\" (UniqueName: \"kubernetes.io/projected/ee1d353b-d53e-4876-81e3-f473e25be987-kube-api-access-rvrg5\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.328974 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-config\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.329136 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-client-ca\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " 
pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.329406 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42e8086a-ace4-4184-81be-2bb6474bd4f1-serving-cert\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.329519 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-config\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.329548 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f47r4\" (UniqueName: \"kubernetes.io/projected/42e8086a-ace4-4184-81be-2bb6474bd4f1-kube-api-access-f47r4\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.420980 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6884cfc866-4rqsh"] Dec 01 20:01:35 crc kubenswrapper[4802]: E1201 20:01:35.421356 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-f47r4 proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" podUID="42e8086a-ace4-4184-81be-2bb6474bd4f1" Dec 
01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.435415 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1d353b-d53e-4876-81e3-f473e25be987-serving-cert\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.435788 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-proxy-ca-bundles\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.435922 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-client-ca\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.436038 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrg5\" (UniqueName: \"kubernetes.io/projected/ee1d353b-d53e-4876-81e3-f473e25be987-kube-api-access-rvrg5\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.436161 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-config\") pod 
\"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.436356 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-client-ca\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.436507 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42e8086a-ace4-4184-81be-2bb6474bd4f1-serving-cert\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.436594 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f47r4\" (UniqueName: \"kubernetes.io/projected/42e8086a-ace4-4184-81be-2bb6474bd4f1-kube-api-access-f47r4\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.436696 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-config\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.439370 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-config\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.439749 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-client-ca\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.443799 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt"] Dec 01 20:01:35 crc kubenswrapper[4802]: E1201 20:01:35.444480 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-rvrg5 serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" podUID="ee1d353b-d53e-4876-81e3-f473e25be987" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.454081 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1d353b-d53e-4876-81e3-f473e25be987-serving-cert\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.459513 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-config\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: 
\"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.459597 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-client-ca\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.460210 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-proxy-ca-bundles\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.460256 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42e8086a-ace4-4184-81be-2bb6474bd4f1-serving-cert\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.464920 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrg5\" (UniqueName: \"kubernetes.io/projected/ee1d353b-d53e-4876-81e3-f473e25be987-kube-api-access-rvrg5\") pod \"route-controller-manager-78948969fd-2cvpt\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:35 crc kubenswrapper[4802]: I1201 20:01:35.465433 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f47r4\" (UniqueName: 
\"kubernetes.io/projected/42e8086a-ace4-4184-81be-2bb6474bd4f1-kube-api-access-f47r4\") pod \"controller-manager-6884cfc866-4rqsh\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.001687 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.001830 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.013362 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.020058 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.146277 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-proxy-ca-bundles\") pod \"42e8086a-ace4-4184-81be-2bb6474bd4f1\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.146383 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1d353b-d53e-4876-81e3-f473e25be987-serving-cert\") pod \"ee1d353b-d53e-4876-81e3-f473e25be987\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.146449 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-config\") pod \"42e8086a-ace4-4184-81be-2bb6474bd4f1\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.146528 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-client-ca\") pod \"ee1d353b-d53e-4876-81e3-f473e25be987\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.146583 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f47r4\" (UniqueName: \"kubernetes.io/projected/42e8086a-ace4-4184-81be-2bb6474bd4f1-kube-api-access-f47r4\") pod \"42e8086a-ace4-4184-81be-2bb6474bd4f1\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.146665 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-config\") pod \"ee1d353b-d53e-4876-81e3-f473e25be987\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.146742 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-client-ca\") pod \"42e8086a-ace4-4184-81be-2bb6474bd4f1\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.146845 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42e8086a-ace4-4184-81be-2bb6474bd4f1-serving-cert\") pod \"42e8086a-ace4-4184-81be-2bb6474bd4f1\" (UID: \"42e8086a-ace4-4184-81be-2bb6474bd4f1\") " Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.146925 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvrg5\" (UniqueName: \"kubernetes.io/projected/ee1d353b-d53e-4876-81e3-f473e25be987-kube-api-access-rvrg5\") pod \"ee1d353b-d53e-4876-81e3-f473e25be987\" (UID: \"ee1d353b-d53e-4876-81e3-f473e25be987\") " Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.147241 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "42e8086a-ace4-4184-81be-2bb6474bd4f1" (UID: "42e8086a-ace4-4184-81be-2bb6474bd4f1"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.147295 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "42e8086a-ace4-4184-81be-2bb6474bd4f1" (UID: "42e8086a-ace4-4184-81be-2bb6474bd4f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.147388 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-config" (OuterVolumeSpecName: "config") pod "42e8086a-ace4-4184-81be-2bb6474bd4f1" (UID: "42e8086a-ace4-4184-81be-2bb6474bd4f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.147487 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-config" (OuterVolumeSpecName: "config") pod "ee1d353b-d53e-4876-81e3-f473e25be987" (UID: "ee1d353b-d53e-4876-81e3-f473e25be987"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.147622 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-client-ca" (OuterVolumeSpecName: "client-ca") pod "ee1d353b-d53e-4876-81e3-f473e25be987" (UID: "ee1d353b-d53e-4876-81e3-f473e25be987"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.150453 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e8086a-ace4-4184-81be-2bb6474bd4f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "42e8086a-ace4-4184-81be-2bb6474bd4f1" (UID: "42e8086a-ace4-4184-81be-2bb6474bd4f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.150463 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1d353b-d53e-4876-81e3-f473e25be987-kube-api-access-rvrg5" (OuterVolumeSpecName: "kube-api-access-rvrg5") pod "ee1d353b-d53e-4876-81e3-f473e25be987" (UID: "ee1d353b-d53e-4876-81e3-f473e25be987"). InnerVolumeSpecName "kube-api-access-rvrg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.151071 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e8086a-ace4-4184-81be-2bb6474bd4f1-kube-api-access-f47r4" (OuterVolumeSpecName: "kube-api-access-f47r4") pod "42e8086a-ace4-4184-81be-2bb6474bd4f1" (UID: "42e8086a-ace4-4184-81be-2bb6474bd4f1"). InnerVolumeSpecName "kube-api-access-f47r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.152574 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1d353b-d53e-4876-81e3-f473e25be987-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ee1d353b-d53e-4876-81e3-f473e25be987" (UID: "ee1d353b-d53e-4876-81e3-f473e25be987"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.247923 4802 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.247974 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f47r4\" (UniqueName: \"kubernetes.io/projected/42e8086a-ace4-4184-81be-2bb6474bd4f1-kube-api-access-f47r4\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.247990 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1d353b-d53e-4876-81e3-f473e25be987-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.248003 4802 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.248015 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42e8086a-ace4-4184-81be-2bb6474bd4f1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.248026 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvrg5\" (UniqueName: \"kubernetes.io/projected/ee1d353b-d53e-4876-81e3-f473e25be987-kube-api-access-rvrg5\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.248038 4802 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.248049 4802 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1d353b-d53e-4876-81e3-f473e25be987-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.248062 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42e8086a-ace4-4184-81be-2bb6474bd4f1-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:01:36 crc kubenswrapper[4802]: I1201 20:01:36.733270 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c261dd-49c4-4f69-9400-7a012e281b7b" path="/var/lib/kubelet/pods/56c261dd-49c4-4f69-9400-7a012e281b7b/volumes" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.005799 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.005804 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6884cfc866-4rqsh" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.041714 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p"] Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.042583 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.045520 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.045748 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.045947 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.046063 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.046331 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.046456 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.047260 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt"] Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.057418 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p"] Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.061325 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78948969fd-2cvpt"] Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.083255 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6884cfc866-4rqsh"] Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.086582 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6884cfc866-4rqsh"] Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.161156 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-serving-cert\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.161235 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-config\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.161258 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87s54\" (UniqueName: \"kubernetes.io/projected/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-kube-api-access-87s54\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.161278 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-client-ca\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " 
pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.262790 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-serving-cert\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.262865 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-config\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.262897 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87s54\" (UniqueName: \"kubernetes.io/projected/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-kube-api-access-87s54\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.262923 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-client-ca\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.264019 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-client-ca\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.264629 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-config\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.272827 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-serving-cert\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.279144 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87s54\" (UniqueName: \"kubernetes.io/projected/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-kube-api-access-87s54\") pod \"route-controller-manager-776ddd45f7-ssd8p\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.358377 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:37 crc kubenswrapper[4802]: I1201 20:01:37.595801 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p"] Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.012047 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" event={"ID":"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944","Type":"ContainerStarted","Data":"1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70"} Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.012472 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.012516 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" event={"ID":"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944","Type":"ContainerStarted","Data":"8e6bb11632ba16e4ded76e588779ee040367983fc89f03f79b1615f71131e4cd"} Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.033554 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" podStartSLOduration=3.033525876 podStartE2EDuration="3.033525876s" podCreationTimestamp="2025-12-01 20:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:01:38.029258013 +0000 UTC m=+319.591817654" watchObservedRunningTime="2025-12-01 20:01:38.033525876 +0000 UTC m=+319.596085517" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.125547 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"] Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.126148 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.128996 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.129337 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.129400 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.131589 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.133384 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.134599 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.153604 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.157794 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"] Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.176144 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/93d41581-b2d3-42b1-a5f2-ad002af6b122-serving-cert\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.176229 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-client-ca\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.176257 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvpkj\" (UniqueName: \"kubernetes.io/projected/93d41581-b2d3-42b1-a5f2-ad002af6b122-kube-api-access-nvpkj\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.176273 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-proxy-ca-bundles\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.176330 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-config\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" 
Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.192551 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.197942 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p"] Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.277018 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93d41581-b2d3-42b1-a5f2-ad002af6b122-serving-cert\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.277068 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-client-ca\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.277096 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-proxy-ca-bundles\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.277113 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvpkj\" (UniqueName: \"kubernetes.io/projected/93d41581-b2d3-42b1-a5f2-ad002af6b122-kube-api-access-nvpkj\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: 
\"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.277148 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-config\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.278340 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-client-ca\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.278521 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-config\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.279234 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-proxy-ca-bundles\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.286058 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93d41581-b2d3-42b1-a5f2-ad002af6b122-serving-cert\") pod 
\"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.299459 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvpkj\" (UniqueName: \"kubernetes.io/projected/93d41581-b2d3-42b1-a5f2-ad002af6b122-kube-api-access-nvpkj\") pod \"controller-manager-7dbb8cdfc9-fzjlw\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") " pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.446487 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.684300 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"] Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.730918 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e8086a-ace4-4184-81be-2bb6474bd4f1" path="/var/lib/kubelet/pods/42e8086a-ace4-4184-81be-2bb6474bd4f1/volumes" Dec 01 20:01:38 crc kubenswrapper[4802]: I1201 20:01:38.731541 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1d353b-d53e-4876-81e3-f473e25be987" path="/var/lib/kubelet/pods/ee1d353b-d53e-4876-81e3-f473e25be987/volumes" Dec 01 20:01:39 crc kubenswrapper[4802]: I1201 20:01:39.018410 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" event={"ID":"93d41581-b2d3-42b1-a5f2-ad002af6b122","Type":"ContainerStarted","Data":"375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38"} Dec 01 20:01:39 crc kubenswrapper[4802]: I1201 20:01:39.018455 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" event={"ID":"93d41581-b2d3-42b1-a5f2-ad002af6b122","Type":"ContainerStarted","Data":"9f705b984a29bedf9016f607c20f6859651ff43bfedc7a4bd4f167e830b85db0"}
Dec 01 20:01:39 crc kubenswrapper[4802]: I1201 20:01:39.040729 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" podStartSLOduration=1.040708931 podStartE2EDuration="1.040708931s" podCreationTimestamp="2025-12-01 20:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:01:39.03679692 +0000 UTC m=+320.599356571" watchObservedRunningTime="2025-12-01 20:01:39.040708931 +0000 UTC m=+320.603268572"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.023989 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" podUID="1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" containerName="route-controller-manager" containerID="cri-o://1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70" gracePeriod=30
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.024262 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.031162 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.363018 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.383717 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"]
Dec 01 20:01:40 crc kubenswrapper[4802]: E1201 20:01:40.383937 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" containerName="route-controller-manager"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.383955 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" containerName="route-controller-manager"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.384060 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" containerName="route-controller-manager"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.384433 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.394847 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"]
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.503372 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-client-ca\") pod \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") "
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.503488 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-serving-cert\") pod \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") "
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.503551 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87s54\" (UniqueName: \"kubernetes.io/projected/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-kube-api-access-87s54\") pod \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") "
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.503609 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-config\") pod \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\" (UID: \"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944\") "
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.503691 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-client-ca\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.503723 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-serving-cert\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.503829 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6zj\" (UniqueName: \"kubernetes.io/projected/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-kube-api-access-qz6zj\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.503923 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-config\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.504036 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-client-ca" (OuterVolumeSpecName: "client-ca") pod "1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" (UID: "1ab4c6b8-aaa4-4e60-ab6e-f811b432c944"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.504148 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-config" (OuterVolumeSpecName: "config") pod "1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" (UID: "1ab4c6b8-aaa4-4e60-ab6e-f811b432c944"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.508758 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-kube-api-access-87s54" (OuterVolumeSpecName: "kube-api-access-87s54") pod "1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" (UID: "1ab4c6b8-aaa4-4e60-ab6e-f811b432c944"). InnerVolumeSpecName "kube-api-access-87s54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.509158 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" (UID: "1ab4c6b8-aaa4-4e60-ab6e-f811b432c944"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.605342 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-client-ca\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.605395 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-serving-cert\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.605432 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6zj\" (UniqueName: \"kubernetes.io/projected/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-kube-api-access-qz6zj\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.605482 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-config\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.605520 4802 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.605533 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.605542 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87s54\" (UniqueName: \"kubernetes.io/projected/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-kube-api-access-87s54\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.605552 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944-config\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.606236 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-client-ca\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.606528 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-config\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.613181 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-serving-cert\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.624683 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6zj\" (UniqueName: \"kubernetes.io/projected/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-kube-api-access-qz6zj\") pod \"route-controller-manager-559846b6c5-9kxqj\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.696339 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:40 crc kubenswrapper[4802]: I1201 20:01:40.945959 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"]
Dec 01 20:01:40 crc kubenswrapper[4802]: W1201 20:01:40.956839 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10ff1dc7_b9d3_4fe6_a5be_a61bb8df887f.slice/crio-62cb104b1f74197214886f580de6be82d276560c062840c6cf347423d43acb89 WatchSource:0}: Error finding container 62cb104b1f74197214886f580de6be82d276560c062840c6cf347423d43acb89: Status 404 returned error can't find the container with id 62cb104b1f74197214886f580de6be82d276560c062840c6cf347423d43acb89
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.030167 4802 generic.go:334] "Generic (PLEG): container finished" podID="1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" containerID="1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70" exitCode=0
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.030289 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.030799 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" event={"ID":"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944","Type":"ContainerDied","Data":"1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70"}
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.030826 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p" event={"ID":"1ab4c6b8-aaa4-4e60-ab6e-f811b432c944","Type":"ContainerDied","Data":"8e6bb11632ba16e4ded76e588779ee040367983fc89f03f79b1615f71131e4cd"}
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.030841 4802 scope.go:117] "RemoveContainer" containerID="1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.033752 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj" event={"ID":"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f","Type":"ContainerStarted","Data":"62cb104b1f74197214886f580de6be82d276560c062840c6cf347423d43acb89"}
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.048474 4802 scope.go:117] "RemoveContainer" containerID="1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70"
Dec 01 20:01:41 crc kubenswrapper[4802]: E1201 20:01:41.048930 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70\": container with ID starting with 1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70 not found: ID does not exist" containerID="1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.048975 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70"} err="failed to get container status \"1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70\": rpc error: code = NotFound desc = could not find container \"1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70\": container with ID starting with 1c2bf8b3489ee95f383ddac8e49e23dd8b3b670c6774dd242f0bad9443b3fd70 not found: ID does not exist"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.049810 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p"]
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.053777 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776ddd45f7-ssd8p"]
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.311142 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f9rgf"]
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.312404 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.323704 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f9rgf"]
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.413949 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39deb742-86b6-49e7-9b16-8c3944d28dc7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.414011 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s5zr\" (UniqueName: \"kubernetes.io/projected/39deb742-86b6-49e7-9b16-8c3944d28dc7-kube-api-access-9s5zr\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.414038 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39deb742-86b6-49e7-9b16-8c3944d28dc7-registry-tls\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.414239 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39deb742-86b6-49e7-9b16-8c3944d28dc7-trusted-ca\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.414380 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39deb742-86b6-49e7-9b16-8c3944d28dc7-registry-certificates\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.414572 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39deb742-86b6-49e7-9b16-8c3944d28dc7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.414680 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.414710 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39deb742-86b6-49e7-9b16-8c3944d28dc7-bound-sa-token\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.439792 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.515859 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s5zr\" (UniqueName: \"kubernetes.io/projected/39deb742-86b6-49e7-9b16-8c3944d28dc7-kube-api-access-9s5zr\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.515911 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39deb742-86b6-49e7-9b16-8c3944d28dc7-registry-tls\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.515944 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39deb742-86b6-49e7-9b16-8c3944d28dc7-trusted-ca\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.515971 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39deb742-86b6-49e7-9b16-8c3944d28dc7-registry-certificates\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.516004 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39deb742-86b6-49e7-9b16-8c3944d28dc7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.516045 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39deb742-86b6-49e7-9b16-8c3944d28dc7-bound-sa-token\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.516070 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39deb742-86b6-49e7-9b16-8c3944d28dc7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.516737 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/39deb742-86b6-49e7-9b16-8c3944d28dc7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.517339 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39deb742-86b6-49e7-9b16-8c3944d28dc7-trusted-ca\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.517466 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/39deb742-86b6-49e7-9b16-8c3944d28dc7-registry-certificates\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.522138 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/39deb742-86b6-49e7-9b16-8c3944d28dc7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.522210 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/39deb742-86b6-49e7-9b16-8c3944d28dc7-registry-tls\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.530336 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s5zr\" (UniqueName: \"kubernetes.io/projected/39deb742-86b6-49e7-9b16-8c3944d28dc7-kube-api-access-9s5zr\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.531690 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39deb742-86b6-49e7-9b16-8c3944d28dc7-bound-sa-token\") pod \"image-registry-66df7c8f76-f9rgf\" (UID: \"39deb742-86b6-49e7-9b16-8c3944d28dc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.625300 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:41 crc kubenswrapper[4802]: I1201 20:01:41.812412 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f9rgf"]
Dec 01 20:01:41 crc kubenswrapper[4802]: W1201 20:01:41.818092 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39deb742_86b6_49e7_9b16_8c3944d28dc7.slice/crio-5da38efe2769d9f24b0401f6e89a68e37ea359ff81b37b040a808cf85057a654 WatchSource:0}: Error finding container 5da38efe2769d9f24b0401f6e89a68e37ea359ff81b37b040a808cf85057a654: Status 404 returned error can't find the container with id 5da38efe2769d9f24b0401f6e89a68e37ea359ff81b37b040a808cf85057a654
Dec 01 20:01:42 crc kubenswrapper[4802]: I1201 20:01:42.042745 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj" event={"ID":"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f","Type":"ContainerStarted","Data":"8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223"}
Dec 01 20:01:42 crc kubenswrapper[4802]: I1201 20:01:42.043118 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:42 crc kubenswrapper[4802]: I1201 20:01:42.044941 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf" event={"ID":"39deb742-86b6-49e7-9b16-8c3944d28dc7","Type":"ContainerStarted","Data":"654e0e5c6a19e2d108901af4772ec6967cfd11adbdee3602d50e4f3872e9ccc7"}
Dec 01 20:01:42 crc kubenswrapper[4802]: I1201 20:01:42.044983 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf" event={"ID":"39deb742-86b6-49e7-9b16-8c3944d28dc7","Type":"ContainerStarted","Data":"5da38efe2769d9f24b0401f6e89a68e37ea359ff81b37b040a808cf85057a654"}
Dec 01 20:01:42 crc kubenswrapper[4802]: I1201 20:01:42.049125 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"
Dec 01 20:01:42 crc kubenswrapper[4802]: I1201 20:01:42.062360 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj" podStartSLOduration=4.062336832 podStartE2EDuration="4.062336832s" podCreationTimestamp="2025-12-01 20:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:01:42.05777838 +0000 UTC m=+323.620338021" watchObservedRunningTime="2025-12-01 20:01:42.062336832 +0000 UTC m=+323.624896493"
Dec 01 20:01:42 crc kubenswrapper[4802]: I1201 20:01:42.076998 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf" podStartSLOduration=1.076977004 podStartE2EDuration="1.076977004s" podCreationTimestamp="2025-12-01 20:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:01:42.075265172 +0000 UTC m=+323.637824823" watchObservedRunningTime="2025-12-01 20:01:42.076977004 +0000 UTC m=+323.639536645"
Dec 01 20:01:42 crc kubenswrapper[4802]: I1201 20:01:42.729852 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab4c6b8-aaa4-4e60-ab6e-f811b432c944" path="/var/lib/kubelet/pods/1ab4c6b8-aaa4-4e60-ab6e-f811b432c944/volumes"
Dec 01 20:01:43 crc kubenswrapper[4802]: I1201 20:01:43.050419 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf"
Dec 01 20:01:53 crc kubenswrapper[4802]: I1201 20:01:53.611491 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"]
Dec 01 20:01:53 crc kubenswrapper[4802]: I1201 20:01:53.612211 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" podUID="93d41581-b2d3-42b1-a5f2-ad002af6b122" containerName="controller-manager" containerID="cri-o://375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38" gracePeriod=30
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.055636 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.112108 4802 generic.go:334] "Generic (PLEG): container finished" podID="93d41581-b2d3-42b1-a5f2-ad002af6b122" containerID="375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38" exitCode=0
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.112159 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" event={"ID":"93d41581-b2d3-42b1-a5f2-ad002af6b122","Type":"ContainerDied","Data":"375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38"}
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.112184 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.112231 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw" event={"ID":"93d41581-b2d3-42b1-a5f2-ad002af6b122","Type":"ContainerDied","Data":"9f705b984a29bedf9016f607c20f6859651ff43bfedc7a4bd4f167e830b85db0"}
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.112257 4802 scope.go:117] "RemoveContainer" containerID="375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38"
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.128919 4802 scope.go:117] "RemoveContainer" containerID="375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38"
Dec 01 20:01:54 crc kubenswrapper[4802]: E1201 20:01:54.130650 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38\": container with ID starting with 375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38 not found: ID does not exist" containerID="375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38"
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.130717 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38"} err="failed to get container status \"375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38\": rpc error: code = NotFound desc = could not find container \"375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38\": container with ID starting with 375e51e3f9ce3a2a40484e223e15ed5e72f2f7394c294d03e00f26e37caf2b38 not found: ID does not exist"
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.184439 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvpkj\" (UniqueName: \"kubernetes.io/projected/93d41581-b2d3-42b1-a5f2-ad002af6b122-kube-api-access-nvpkj\") pod \"93d41581-b2d3-42b1-a5f2-ad002af6b122\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") "
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.184489 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93d41581-b2d3-42b1-a5f2-ad002af6b122-serving-cert\") pod \"93d41581-b2d3-42b1-a5f2-ad002af6b122\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") "
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.184510 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-proxy-ca-bundles\") pod \"93d41581-b2d3-42b1-a5f2-ad002af6b122\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") "
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.184613 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-config\") pod \"93d41581-b2d3-42b1-a5f2-ad002af6b122\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") "
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.184631 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-client-ca\") pod \"93d41581-b2d3-42b1-a5f2-ad002af6b122\" (UID: \"93d41581-b2d3-42b1-a5f2-ad002af6b122\") "
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.185386 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-client-ca" (OuterVolumeSpecName: "client-ca") pod "93d41581-b2d3-42b1-a5f2-ad002af6b122" (UID: "93d41581-b2d3-42b1-a5f2-ad002af6b122"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.185419 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "93d41581-b2d3-42b1-a5f2-ad002af6b122" (UID: "93d41581-b2d3-42b1-a5f2-ad002af6b122"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.185538 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-config" (OuterVolumeSpecName: "config") pod "93d41581-b2d3-42b1-a5f2-ad002af6b122" (UID: "93d41581-b2d3-42b1-a5f2-ad002af6b122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.190148 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d41581-b2d3-42b1-a5f2-ad002af6b122-kube-api-access-nvpkj" (OuterVolumeSpecName: "kube-api-access-nvpkj") pod "93d41581-b2d3-42b1-a5f2-ad002af6b122" (UID: "93d41581-b2d3-42b1-a5f2-ad002af6b122"). InnerVolumeSpecName "kube-api-access-nvpkj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.190575 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d41581-b2d3-42b1-a5f2-ad002af6b122-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "93d41581-b2d3-42b1-a5f2-ad002af6b122" (UID: "93d41581-b2d3-42b1-a5f2-ad002af6b122"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.286030 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-config\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.286072 4802 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-client-ca\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.286083 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvpkj\" (UniqueName: \"kubernetes.io/projected/93d41581-b2d3-42b1-a5f2-ad002af6b122-kube-api-access-nvpkj\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.286093 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93d41581-b2d3-42b1-a5f2-ad002af6b122-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.286101 4802 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/93d41581-b2d3-42b1-a5f2-ad002af6b122-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.438005 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"]
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.440962 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7dbb8cdfc9-fzjlw"]
Dec 01 20:01:54 crc kubenswrapper[4802]: I1201 20:01:54.727956 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d41581-b2d3-42b1-a5f2-ad002af6b122"
path="/var/lib/kubelet/pods/93d41581-b2d3-42b1-a5f2-ad002af6b122/volumes" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.211268 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5597f96679-p482t"] Dec 01 20:01:55 crc kubenswrapper[4802]: E1201 20:01:55.211599 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d41581-b2d3-42b1-a5f2-ad002af6b122" containerName="controller-manager" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.211615 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d41581-b2d3-42b1-a5f2-ad002af6b122" containerName="controller-manager" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.211748 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d41581-b2d3-42b1-a5f2-ad002af6b122" containerName="controller-manager" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.212300 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.214559 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.214915 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.215069 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.215217 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.215439 4802 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.215648 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.223452 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.230144 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5597f96679-p482t"] Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.298830 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1e973a1-3c95-43da-9343-c808a27307de-serving-cert\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.299021 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e973a1-3c95-43da-9343-c808a27307de-config\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.299085 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1e973a1-3c95-43da-9343-c808a27307de-proxy-ca-bundles\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.299122 4802 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1e973a1-3c95-43da-9343-c808a27307de-client-ca\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.299159 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfxzg\" (UniqueName: \"kubernetes.io/projected/e1e973a1-3c95-43da-9343-c808a27307de-kube-api-access-kfxzg\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.400876 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfxzg\" (UniqueName: \"kubernetes.io/projected/e1e973a1-3c95-43da-9343-c808a27307de-kube-api-access-kfxzg\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.401429 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1e973a1-3c95-43da-9343-c808a27307de-serving-cert\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.401654 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e973a1-3c95-43da-9343-c808a27307de-config\") pod \"controller-manager-5597f96679-p482t\" (UID: 
\"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.401821 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1e973a1-3c95-43da-9343-c808a27307de-proxy-ca-bundles\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.401970 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1e973a1-3c95-43da-9343-c808a27307de-client-ca\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.403226 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1e973a1-3c95-43da-9343-c808a27307de-client-ca\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.403747 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1e973a1-3c95-43da-9343-c808a27307de-proxy-ca-bundles\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.403886 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e1e973a1-3c95-43da-9343-c808a27307de-config\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.410529 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1e973a1-3c95-43da-9343-c808a27307de-serving-cert\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.419907 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfxzg\" (UniqueName: \"kubernetes.io/projected/e1e973a1-3c95-43da-9343-c808a27307de-kube-api-access-kfxzg\") pod \"controller-manager-5597f96679-p482t\" (UID: \"e1e973a1-3c95-43da-9343-c808a27307de\") " pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.530318 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:55 crc kubenswrapper[4802]: I1201 20:01:55.953164 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5597f96679-p482t"] Dec 01 20:01:56 crc kubenswrapper[4802]: I1201 20:01:56.128242 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5597f96679-p482t" event={"ID":"e1e973a1-3c95-43da-9343-c808a27307de","Type":"ContainerStarted","Data":"ef14026af100c5a5e3337521acb1fb3501701c1ea0efdc05ba7cc6de6abdf4c3"} Dec 01 20:01:56 crc kubenswrapper[4802]: I1201 20:01:56.128605 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5597f96679-p482t" event={"ID":"e1e973a1-3c95-43da-9343-c808a27307de","Type":"ContainerStarted","Data":"78e092b11411f452b834180249de1fed39d2f1cc0c0cb775c07ce61f614ce6af"} Dec 01 20:01:56 crc kubenswrapper[4802]: I1201 20:01:56.128629 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:01:56 crc kubenswrapper[4802]: I1201 20:01:56.130792 4802 patch_prober.go:28] interesting pod/controller-manager-5597f96679-p482t container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Dec 01 20:01:56 crc kubenswrapper[4802]: I1201 20:01:56.130841 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5597f96679-p482t" podUID="e1e973a1-3c95-43da-9343-c808a27307de" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Dec 01 20:01:56 crc kubenswrapper[4802]: I1201 20:01:56.147936 4802 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5597f96679-p482t" podStartSLOduration=3.147918339 podStartE2EDuration="3.147918339s" podCreationTimestamp="2025-12-01 20:01:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:01:56.144032336 +0000 UTC m=+337.706591987" watchObservedRunningTime="2025-12-01 20:01:56.147918339 +0000 UTC m=+337.710477980" Dec 01 20:01:57 crc kubenswrapper[4802]: I1201 20:01:57.136841 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5597f96679-p482t" Dec 01 20:02:01 crc kubenswrapper[4802]: I1201 20:02:01.634137 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-f9rgf" Dec 01 20:02:01 crc kubenswrapper[4802]: I1201 20:02:01.702372 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zqg5r"] Dec 01 20:02:02 crc kubenswrapper[4802]: I1201 20:02:02.965247 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bph4q"] Dec 01 20:02:02 crc kubenswrapper[4802]: I1201 20:02:02.966727 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bph4q" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" containerName="registry-server" containerID="cri-o://9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7" gracePeriod=30 Dec 01 20:02:02 crc kubenswrapper[4802]: I1201 20:02:02.978812 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dtjsm"] Dec 01 20:02:02 crc kubenswrapper[4802]: I1201 20:02:02.979117 4802 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-dtjsm" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerName="registry-server" containerID="cri-o://5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1" gracePeriod=30 Dec 01 20:02:02 crc kubenswrapper[4802]: I1201 20:02:02.994328 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pb67"] Dec 01 20:02:02 crc kubenswrapper[4802]: I1201 20:02:02.994625 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" podUID="0953c320-4dd8-4914-a84d-01bf5e9f11aa" containerName="marketplace-operator" containerID="cri-o://55d95a4424fd1ba662cdc14de1df516e788799a714524c93676d9f5d58cef4dd" gracePeriod=30 Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.005763 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p7x"] Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.006017 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k9p7x" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerName="registry-server" containerID="cri-o://cfee11450d517ccde8729166af1baf922ff181cf5ded64da4732e5bf8687e15e" gracePeriod=30 Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.018729 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpwgt"] Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.019897 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.023756 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c6vvc"] Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.023921 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c6vvc" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerName="registry-server" containerID="cri-o://5697dd74be08b6b68fb9fb225ea44b1781a2eb0e0429ac647f728fee279a3933" gracePeriod=30 Dec 01 20:02:03 crc kubenswrapper[4802]: E1201 20:02:03.026539 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1 is running failed: container process not found" containerID="5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.029056 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpwgt"] Dec 01 20:02:03 crc kubenswrapper[4802]: E1201 20:02:03.030516 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1 is running failed: container process not found" containerID="5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 20:02:03 crc kubenswrapper[4802]: E1201 20:02:03.031151 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1 is 
running failed: container process not found" containerID="5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 20:02:03 crc kubenswrapper[4802]: E1201 20:02:03.031211 4802 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-dtjsm" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerName="registry-server" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.111732 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9688r\" (UniqueName: \"kubernetes.io/projected/4fffad75-c42a-40d4-a2f3-d770091b01fa-kube-api-access-9688r\") pod \"marketplace-operator-79b997595-fpwgt\" (UID: \"4fffad75-c42a-40d4-a2f3-d770091b01fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.111774 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fffad75-c42a-40d4-a2f3-d770091b01fa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpwgt\" (UID: \"4fffad75-c42a-40d4-a2f3-d770091b01fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.111851 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fffad75-c42a-40d4-a2f3-d770091b01fa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpwgt\" (UID: \"4fffad75-c42a-40d4-a2f3-d770091b01fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 
20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.204180 4802 generic.go:334] "Generic (PLEG): container finished" podID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerID="5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1" exitCode=0 Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.204229 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtjsm" event={"ID":"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b","Type":"ContainerDied","Data":"5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1"} Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.210387 4802 generic.go:334] "Generic (PLEG): container finished" podID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerID="5697dd74be08b6b68fb9fb225ea44b1781a2eb0e0429ac647f728fee279a3933" exitCode=0 Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.210448 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vvc" event={"ID":"2f8519d9-5b33-4c4d-b430-6497b8bcc71b","Type":"ContainerDied","Data":"5697dd74be08b6b68fb9fb225ea44b1781a2eb0e0429ac647f728fee279a3933"} Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.212712 4802 generic.go:334] "Generic (PLEG): container finished" podID="0953c320-4dd8-4914-a84d-01bf5e9f11aa" containerID="55d95a4424fd1ba662cdc14de1df516e788799a714524c93676d9f5d58cef4dd" exitCode=0 Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.212800 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" event={"ID":"0953c320-4dd8-4914-a84d-01bf5e9f11aa","Type":"ContainerDied","Data":"55d95a4424fd1ba662cdc14de1df516e788799a714524c93676d9f5d58cef4dd"} Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.214691 4802 generic.go:334] "Generic (PLEG): container finished" podID="1ef3009d-6227-4034-8325-544c3386a9fd" containerID="9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7" exitCode=0 Dec 01 
20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.214743 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bph4q" event={"ID":"1ef3009d-6227-4034-8325-544c3386a9fd","Type":"ContainerDied","Data":"9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7"} Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.215373 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fffad75-c42a-40d4-a2f3-d770091b01fa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpwgt\" (UID: \"4fffad75-c42a-40d4-a2f3-d770091b01fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.215400 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9688r\" (UniqueName: \"kubernetes.io/projected/4fffad75-c42a-40d4-a2f3-d770091b01fa-kube-api-access-9688r\") pod \"marketplace-operator-79b997595-fpwgt\" (UID: \"4fffad75-c42a-40d4-a2f3-d770091b01fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.215447 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fffad75-c42a-40d4-a2f3-d770091b01fa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpwgt\" (UID: \"4fffad75-c42a-40d4-a2f3-d770091b01fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.217840 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fffad75-c42a-40d4-a2f3-d770091b01fa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpwgt\" (UID: \"4fffad75-c42a-40d4-a2f3-d770091b01fa\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.222497 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4fffad75-c42a-40d4-a2f3-d770091b01fa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpwgt\" (UID: \"4fffad75-c42a-40d4-a2f3-d770091b01fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.227674 4802 generic.go:334] "Generic (PLEG): container finished" podID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerID="cfee11450d517ccde8729166af1baf922ff181cf5ded64da4732e5bf8687e15e" exitCode=0 Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.227765 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p7x" event={"ID":"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5","Type":"ContainerDied","Data":"cfee11450d517ccde8729166af1baf922ff181cf5ded64da4732e5bf8687e15e"} Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.233398 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9688r\" (UniqueName: \"kubernetes.io/projected/4fffad75-c42a-40d4-a2f3-d770091b01fa-kube-api-access-9688r\") pod \"marketplace-operator-79b997595-fpwgt\" (UID: \"4fffad75-c42a-40d4-a2f3-d770091b01fa\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: E1201 20:02:03.243440 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7 is running failed: container process not found" containerID="9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 20:02:03 crc kubenswrapper[4802]: E1201 
20:02:03.243733 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7 is running failed: container process not found" containerID="9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 20:02:03 crc kubenswrapper[4802]: E1201 20:02:03.244016 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7 is running failed: container process not found" containerID="9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7" cmd=["grpc_health_probe","-addr=:50051"] Dec 01 20:02:03 crc kubenswrapper[4802]: E1201 20:02:03.244053 4802 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-bph4q" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" containerName="registry-server" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.349937 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.580018 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bph4q" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.622817 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.662233 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.666786 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.683411 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dtjsm" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.720324 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-utilities\") pod \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.720396 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwszf\" (UniqueName: \"kubernetes.io/projected/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-kube-api-access-mwszf\") pod \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.720457 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-utilities\") pod \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.720479 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-catalog-content\") 
pod \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\" (UID: \"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.720505 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk8bm\" (UniqueName: \"kubernetes.io/projected/0953c320-4dd8-4914-a84d-01bf5e9f11aa-kube-api-access-gk8bm\") pod \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.720531 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-operator-metrics\") pod \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.720572 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-catalog-content\") pod \"1ef3009d-6227-4034-8325-544c3386a9fd\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.720633 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndhmb\" (UniqueName: \"kubernetes.io/projected/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-kube-api-access-ndhmb\") pod \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.721962 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-utilities" (OuterVolumeSpecName: "utilities") pod "d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" (UID: "d4f1a65e-833b-4450-a3b1-7d4f67f05fb5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.722643 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-utilities" (OuterVolumeSpecName: "utilities") pod "2f8519d9-5b33-4c4d-b430-6497b8bcc71b" (UID: "2f8519d9-5b33-4c4d-b430-6497b8bcc71b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.724987 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-kube-api-access-ndhmb" (OuterVolumeSpecName: "kube-api-access-ndhmb") pod "2f8519d9-5b33-4c4d-b430-6497b8bcc71b" (UID: "2f8519d9-5b33-4c4d-b430-6497b8bcc71b"). InnerVolumeSpecName "kube-api-access-ndhmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.725773 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef3009d-6227-4034-8325-544c3386a9fd-kube-api-access-58m2r" (OuterVolumeSpecName: "kube-api-access-58m2r") pod "1ef3009d-6227-4034-8325-544c3386a9fd" (UID: "1ef3009d-6227-4034-8325-544c3386a9fd"). InnerVolumeSpecName "kube-api-access-58m2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.726271 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58m2r\" (UniqueName: \"kubernetes.io/projected/1ef3009d-6227-4034-8325-544c3386a9fd-kube-api-access-58m2r\") pod \"1ef3009d-6227-4034-8325-544c3386a9fd\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.726378 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-catalog-content\") pod \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\" (UID: \"2f8519d9-5b33-4c4d-b430-6497b8bcc71b\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.726445 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-utilities\") pod \"1ef3009d-6227-4034-8325-544c3386a9fd\" (UID: \"1ef3009d-6227-4034-8325-544c3386a9fd\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.726476 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-trusted-ca\") pod \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\" (UID: \"0953c320-4dd8-4914-a84d-01bf5e9f11aa\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.726835 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.726852 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-utilities\") on node \"crc\" 
DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.726862 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndhmb\" (UniqueName: \"kubernetes.io/projected/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-kube-api-access-ndhmb\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.726872 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58m2r\" (UniqueName: \"kubernetes.io/projected/1ef3009d-6227-4034-8325-544c3386a9fd-kube-api-access-58m2r\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.727764 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-utilities" (OuterVolumeSpecName: "utilities") pod "1ef3009d-6227-4034-8325-544c3386a9fd" (UID: "1ef3009d-6227-4034-8325-544c3386a9fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.728251 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0953c320-4dd8-4914-a84d-01bf5e9f11aa" (UID: "0953c320-4dd8-4914-a84d-01bf5e9f11aa"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.737019 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-kube-api-access-mwszf" (OuterVolumeSpecName: "kube-api-access-mwszf") pod "d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" (UID: "d4f1a65e-833b-4450-a3b1-7d4f67f05fb5"). InnerVolumeSpecName "kube-api-access-mwszf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.737297 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0953c320-4dd8-4914-a84d-01bf5e9f11aa-kube-api-access-gk8bm" (OuterVolumeSpecName: "kube-api-access-gk8bm") pod "0953c320-4dd8-4914-a84d-01bf5e9f11aa" (UID: "0953c320-4dd8-4914-a84d-01bf5e9f11aa"). InnerVolumeSpecName "kube-api-access-gk8bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.738334 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" (UID: "d4f1a65e-833b-4450-a3b1-7d4f67f05fb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.738746 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0953c320-4dd8-4914-a84d-01bf5e9f11aa" (UID: "0953c320-4dd8-4914-a84d-01bf5e9f11aa"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.766352 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ef3009d-6227-4034-8325-544c3386a9fd" (UID: "1ef3009d-6227-4034-8325-544c3386a9fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.827711 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-catalog-content\") pod \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.827814 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c49j2\" (UniqueName: \"kubernetes.io/projected/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-kube-api-access-c49j2\") pod \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.827839 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-utilities\") pod \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\" (UID: \"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b\") " Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.828120 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.828153 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk8bm\" (UniqueName: \"kubernetes.io/projected/0953c320-4dd8-4914-a84d-01bf5e9f11aa-kube-api-access-gk8bm\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.828166 4802 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc 
kubenswrapper[4802]: I1201 20:02:03.828175 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.828186 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef3009d-6227-4034-8325-544c3386a9fd-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.828219 4802 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0953c320-4dd8-4914-a84d-01bf5e9f11aa-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.828233 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwszf\" (UniqueName: \"kubernetes.io/projected/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5-kube-api-access-mwszf\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.828815 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-utilities" (OuterVolumeSpecName: "utilities") pod "3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" (UID: "3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.833908 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-kube-api-access-c49j2" (OuterVolumeSpecName: "kube-api-access-c49j2") pod "3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" (UID: "3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b"). InnerVolumeSpecName "kube-api-access-c49j2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.834105 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f8519d9-5b33-4c4d-b430-6497b8bcc71b" (UID: "2f8519d9-5b33-4c4d-b430-6497b8bcc71b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.880364 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" (UID: "3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.915410 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpwgt"] Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.928922 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f8519d9-5b33-4c4d-b430-6497b8bcc71b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.929047 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c49j2\" (UniqueName: \"kubernetes.io/projected/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-kube-api-access-c49j2\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 20:02:03.929122 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:03 crc kubenswrapper[4802]: I1201 
20:02:03.929188 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.233690 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" event={"ID":"4fffad75-c42a-40d4-a2f3-d770091b01fa","Type":"ContainerStarted","Data":"2f4407fc4a596b264dd20b98cee31125881d12361994f529b812a0143af48209"} Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.233994 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.234017 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" event={"ID":"4fffad75-c42a-40d4-a2f3-d770091b01fa","Type":"ContainerStarted","Data":"de1da5a0e9db41611e30db42c89a84813ed0410c91f3fc666f48af36164ee80f"} Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.235092 4802 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fpwgt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.235139 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" podUID="4fffad75-c42a-40d4-a2f3-d770091b01fa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.238164 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bph4q" event={"ID":"1ef3009d-6227-4034-8325-544c3386a9fd","Type":"ContainerDied","Data":"f30cd507fc830616a5d986483428a388b3ef834d9469422fdb786ec925f1ef3d"} Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.238193 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bph4q" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.238221 4802 scope.go:117] "RemoveContainer" containerID="9ef3834a416117100c9587e11c664c3b1747119fe77572267b8a4462cee75ad7" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.243757 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k9p7x" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.243750 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k9p7x" event={"ID":"d4f1a65e-833b-4450-a3b1-7d4f67f05fb5","Type":"ContainerDied","Data":"08917d33458c7eeca5a0dbb57719791023cd11df7d4c3c970d16fecfc8495d10"} Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.245970 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtjsm" event={"ID":"3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b","Type":"ContainerDied","Data":"b3cdd24c2dbdc1087a9199b1955ab96f8dff0157c6862a25ddec8109311bd4cf"} Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.246089 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dtjsm" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.250506 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c6vvc" event={"ID":"2f8519d9-5b33-4c4d-b430-6497b8bcc71b","Type":"ContainerDied","Data":"3775abdab8611631ca62e02c0f19059b5fc373aa000b9fbd2bd189f1e4fef510"} Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.250542 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c6vvc" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.252060 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" event={"ID":"0953c320-4dd8-4914-a84d-01bf5e9f11aa","Type":"ContainerDied","Data":"97730183cfc38a36abed6c76a3286d5ce203c7c2e8ff0c8fed8401c1a780ebfc"} Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.252091 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2pb67" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.262523 4802 scope.go:117] "RemoveContainer" containerID="759da204a667308b56f6747b2ff785161a756a324bc80faa407bfda5756037ad" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.264352 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" podStartSLOduration=2.264335262 podStartE2EDuration="2.264335262s" podCreationTimestamp="2025-12-01 20:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:02:04.252074062 +0000 UTC m=+345.814633723" watchObservedRunningTime="2025-12-01 20:02:04.264335262 +0000 UTC m=+345.826894913" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.294508 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bph4q"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.295463 4802 scope.go:117] "RemoveContainer" containerID="c71015f776d0c2715eb53297e668dfa9ae133de4f484daa7fde1b795280dcb77" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.299294 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bph4q"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.323021 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c6vvc"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.325036 4802 scope.go:117] "RemoveContainer" containerID="cfee11450d517ccde8729166af1baf922ff181cf5ded64da4732e5bf8687e15e" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.332340 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c6vvc"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.346394 4802 scope.go:117] 
"RemoveContainer" containerID="0116d55eba32961f51911f0e9406982b986658788f480f4ec75cf8ef62659ae9" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.358683 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dtjsm"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.364781 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dtjsm"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.372283 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p7x"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.374061 4802 scope.go:117] "RemoveContainer" containerID="5682fdc797da598d1b28821e01bb528ea3345f8e31232802248320950842c3e3" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.376882 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k9p7x"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.380623 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pb67"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.383362 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2pb67"] Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.401627 4802 scope.go:117] "RemoveContainer" containerID="5ed346a0fb4e785a994dbd96f916ee29daa2c05704b3ad7d215880bfb90559f1" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.411996 4802 scope.go:117] "RemoveContainer" containerID="015b2a2cd71b2686455cd2930b082b4f9b0d357c6b0339a5a8d4dfc897b36b03" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.424126 4802 scope.go:117] "RemoveContainer" containerID="a30774b380c596a09ae08f4c36a64624a4a1af505e3cc5981980a43710a96c2c" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.439731 4802 scope.go:117] "RemoveContainer" 
containerID="5697dd74be08b6b68fb9fb225ea44b1781a2eb0e0429ac647f728fee279a3933" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.453601 4802 scope.go:117] "RemoveContainer" containerID="f7fb812ae1b0870bb01d3bd227a151bfd3ef8c8d113fd43683b19fd4415c4f67" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.475261 4802 scope.go:117] "RemoveContainer" containerID="dff4b0a63cec243c64d35c22bf2e4bedab08fe65c4cd04d20b67a7e8905d1fce" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.488524 4802 scope.go:117] "RemoveContainer" containerID="55d95a4424fd1ba662cdc14de1df516e788799a714524c93676d9f5d58cef4dd" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.727570 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0953c320-4dd8-4914-a84d-01bf5e9f11aa" path="/var/lib/kubelet/pods/0953c320-4dd8-4914-a84d-01bf5e9f11aa/volumes" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.728600 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" path="/var/lib/kubelet/pods/1ef3009d-6227-4034-8325-544c3386a9fd/volumes" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.729823 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" path="/var/lib/kubelet/pods/2f8519d9-5b33-4c4d-b430-6497b8bcc71b/volumes" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.732013 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" path="/var/lib/kubelet/pods/3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b/volumes" Dec 01 20:02:04 crc kubenswrapper[4802]: I1201 20:02:04.732852 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" path="/var/lib/kubelet/pods/d4f1a65e-833b-4450-a3b1-7d4f67f05fb5/volumes" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.271454 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fpwgt" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575344 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hhc44"] Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575540 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerName="extract-content" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575551 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerName="extract-content" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575561 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575567 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575575 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575582 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575593 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0953c320-4dd8-4914-a84d-01bf5e9f11aa" containerName="marketplace-operator" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575600 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="0953c320-4dd8-4914-a84d-01bf5e9f11aa" containerName="marketplace-operator" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575608 4802 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerName="extract-utilities" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575614 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerName="extract-utilities" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575626 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" containerName="extract-content" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575632 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" containerName="extract-content" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575639 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerName="extract-content" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575645 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerName="extract-content" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575654 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerName="extract-utilities" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575660 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerName="extract-utilities" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575667 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575672 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575680 4802 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerName="extract-utilities" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575685 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerName="extract-utilities" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575696 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" containerName="extract-utilities" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575701 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef3009d-6227-4034-8325-544c3386a9fd" containerName="extract-utilities" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575710 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerName="extract-content" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575716 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerName="extract-content" Dec 01 20:02:05 crc kubenswrapper[4802]: E1201 20:02:05.575725 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575730 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575820 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4f1a65e-833b-4450-a3b1-7d4f67f05fb5" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575831 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f0c2e2b-e4b4-4733-831f-1a4f3e90b57b" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575840 4802 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1ef3009d-6227-4034-8325-544c3386a9fd" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575849 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f8519d9-5b33-4c4d-b430-6497b8bcc71b" containerName="registry-server" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.575859 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="0953c320-4dd8-4914-a84d-01bf5e9f11aa" containerName="marketplace-operator" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.576505 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.578922 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.585006 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hhc44"] Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.653561 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3118d07d-4c45-4759-b2a0-792e5f4ca0fc-catalog-content\") pod \"redhat-operators-hhc44\" (UID: \"3118d07d-4c45-4759-b2a0-792e5f4ca0fc\") " pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.653671 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3118d07d-4c45-4759-b2a0-792e5f4ca0fc-utilities\") pod \"redhat-operators-hhc44\" (UID: \"3118d07d-4c45-4759-b2a0-792e5f4ca0fc\") " pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.653705 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-8lqp9\" (UniqueName: \"kubernetes.io/projected/3118d07d-4c45-4759-b2a0-792e5f4ca0fc-kube-api-access-8lqp9\") pod \"redhat-operators-hhc44\" (UID: \"3118d07d-4c45-4759-b2a0-792e5f4ca0fc\") " pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.754557 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3118d07d-4c45-4759-b2a0-792e5f4ca0fc-catalog-content\") pod \"redhat-operators-hhc44\" (UID: \"3118d07d-4c45-4759-b2a0-792e5f4ca0fc\") " pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.754626 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3118d07d-4c45-4759-b2a0-792e5f4ca0fc-utilities\") pod \"redhat-operators-hhc44\" (UID: \"3118d07d-4c45-4759-b2a0-792e5f4ca0fc\") " pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.754648 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lqp9\" (UniqueName: \"kubernetes.io/projected/3118d07d-4c45-4759-b2a0-792e5f4ca0fc-kube-api-access-8lqp9\") pod \"redhat-operators-hhc44\" (UID: \"3118d07d-4c45-4759-b2a0-792e5f4ca0fc\") " pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.755357 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3118d07d-4c45-4759-b2a0-792e5f4ca0fc-utilities\") pod \"redhat-operators-hhc44\" (UID: \"3118d07d-4c45-4759-b2a0-792e5f4ca0fc\") " pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.755392 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3118d07d-4c45-4759-b2a0-792e5f4ca0fc-catalog-content\") pod \"redhat-operators-hhc44\" (UID: \"3118d07d-4c45-4759-b2a0-792e5f4ca0fc\") " pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.775437 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lqp9\" (UniqueName: \"kubernetes.io/projected/3118d07d-4c45-4759-b2a0-792e5f4ca0fc-kube-api-access-8lqp9\") pod \"redhat-operators-hhc44\" (UID: \"3118d07d-4c45-4759-b2a0-792e5f4ca0fc\") " pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:05 crc kubenswrapper[4802]: I1201 20:02:05.895091 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.290269 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hhc44"] Dec 01 20:02:06 crc kubenswrapper[4802]: W1201 20:02:06.294927 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3118d07d_4c45_4759_b2a0_792e5f4ca0fc.slice/crio-f68a3dd8ca1ef9805cea9837e7e8c26b15bece6cca3b4ae3e929bf7975e8ecb8 WatchSource:0}: Error finding container f68a3dd8ca1ef9805cea9837e7e8c26b15bece6cca3b4ae3e929bf7975e8ecb8: Status 404 returned error can't find the container with id f68a3dd8ca1ef9805cea9837e7e8c26b15bece6cca3b4ae3e929bf7975e8ecb8 Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.575013 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q86qm"] Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.576526 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.579304 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.587673 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q86qm"] Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.666597 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b32805-cd2f-493c-a2be-2d993678cd06-catalog-content\") pod \"certified-operators-q86qm\" (UID: \"a3b32805-cd2f-493c-a2be-2d993678cd06\") " pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.666661 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b32805-cd2f-493c-a2be-2d993678cd06-utilities\") pod \"certified-operators-q86qm\" (UID: \"a3b32805-cd2f-493c-a2be-2d993678cd06\") " pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.666749 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf879\" (UniqueName: \"kubernetes.io/projected/a3b32805-cd2f-493c-a2be-2d993678cd06-kube-api-access-hf879\") pod \"certified-operators-q86qm\" (UID: \"a3b32805-cd2f-493c-a2be-2d993678cd06\") " pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.768239 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf879\" (UniqueName: \"kubernetes.io/projected/a3b32805-cd2f-493c-a2be-2d993678cd06-kube-api-access-hf879\") pod \"certified-operators-q86qm\" 
(UID: \"a3b32805-cd2f-493c-a2be-2d993678cd06\") " pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.768324 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b32805-cd2f-493c-a2be-2d993678cd06-catalog-content\") pod \"certified-operators-q86qm\" (UID: \"a3b32805-cd2f-493c-a2be-2d993678cd06\") " pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.768344 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b32805-cd2f-493c-a2be-2d993678cd06-utilities\") pod \"certified-operators-q86qm\" (UID: \"a3b32805-cd2f-493c-a2be-2d993678cd06\") " pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.768896 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b32805-cd2f-493c-a2be-2d993678cd06-utilities\") pod \"certified-operators-q86qm\" (UID: \"a3b32805-cd2f-493c-a2be-2d993678cd06\") " pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.768991 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b32805-cd2f-493c-a2be-2d993678cd06-catalog-content\") pod \"certified-operators-q86qm\" (UID: \"a3b32805-cd2f-493c-a2be-2d993678cd06\") " pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.791002 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf879\" (UniqueName: \"kubernetes.io/projected/a3b32805-cd2f-493c-a2be-2d993678cd06-kube-api-access-hf879\") pod \"certified-operators-q86qm\" (UID: \"a3b32805-cd2f-493c-a2be-2d993678cd06\") " 
pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:06 crc kubenswrapper[4802]: I1201 20:02:06.900621 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:07 crc kubenswrapper[4802]: I1201 20:02:07.284878 4802 generic.go:334] "Generic (PLEG): container finished" podID="3118d07d-4c45-4759-b2a0-792e5f4ca0fc" containerID="c39d9fe9ee329f245e0515abe6e888cf440742e1b9a73585a948f70f78080004" exitCode=0 Dec 01 20:02:07 crc kubenswrapper[4802]: I1201 20:02:07.284952 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhc44" event={"ID":"3118d07d-4c45-4759-b2a0-792e5f4ca0fc","Type":"ContainerDied","Data":"c39d9fe9ee329f245e0515abe6e888cf440742e1b9a73585a948f70f78080004"} Dec 01 20:02:07 crc kubenswrapper[4802]: I1201 20:02:07.285406 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhc44" event={"ID":"3118d07d-4c45-4759-b2a0-792e5f4ca0fc","Type":"ContainerStarted","Data":"f68a3dd8ca1ef9805cea9837e7e8c26b15bece6cca3b4ae3e929bf7975e8ecb8"} Dec 01 20:02:07 crc kubenswrapper[4802]: I1201 20:02:07.313400 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q86qm"] Dec 01 20:02:07 crc kubenswrapper[4802]: W1201 20:02:07.317131 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b32805_cd2f_493c_a2be_2d993678cd06.slice/crio-63b32829da62203d827040636afa5a6daa5d48d61dd20f7d974f8348f570c1a5 WatchSource:0}: Error finding container 63b32829da62203d827040636afa5a6daa5d48d61dd20f7d974f8348f570c1a5: Status 404 returned error can't find the container with id 63b32829da62203d827040636afa5a6daa5d48d61dd20f7d974f8348f570c1a5 Dec 01 20:02:07 crc kubenswrapper[4802]: I1201 20:02:07.968671 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-wcdzr"] Dec 01 20:02:07 crc kubenswrapper[4802]: I1201 20:02:07.970053 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:07 crc kubenswrapper[4802]: I1201 20:02:07.972063 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 20:02:07 crc kubenswrapper[4802]: I1201 20:02:07.980980 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcdzr"] Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.086136 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w9d9\" (UniqueName: \"kubernetes.io/projected/4f8205a1-38e0-47e3-be8f-5add6cdac5cb-kube-api-access-9w9d9\") pod \"community-operators-wcdzr\" (UID: \"4f8205a1-38e0-47e3-be8f-5add6cdac5cb\") " pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.086612 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8205a1-38e0-47e3-be8f-5add6cdac5cb-catalog-content\") pod \"community-operators-wcdzr\" (UID: \"4f8205a1-38e0-47e3-be8f-5add6cdac5cb\") " pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.086882 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8205a1-38e0-47e3-be8f-5add6cdac5cb-utilities\") pod \"community-operators-wcdzr\" (UID: \"4f8205a1-38e0-47e3-be8f-5add6cdac5cb\") " pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.188113 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9w9d9\" (UniqueName: \"kubernetes.io/projected/4f8205a1-38e0-47e3-be8f-5add6cdac5cb-kube-api-access-9w9d9\") pod \"community-operators-wcdzr\" (UID: \"4f8205a1-38e0-47e3-be8f-5add6cdac5cb\") " pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.188271 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8205a1-38e0-47e3-be8f-5add6cdac5cb-catalog-content\") pod \"community-operators-wcdzr\" (UID: \"4f8205a1-38e0-47e3-be8f-5add6cdac5cb\") " pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.188315 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8205a1-38e0-47e3-be8f-5add6cdac5cb-utilities\") pod \"community-operators-wcdzr\" (UID: \"4f8205a1-38e0-47e3-be8f-5add6cdac5cb\") " pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.189349 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8205a1-38e0-47e3-be8f-5add6cdac5cb-utilities\") pod \"community-operators-wcdzr\" (UID: \"4f8205a1-38e0-47e3-be8f-5add6cdac5cb\") " pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.189373 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8205a1-38e0-47e3-be8f-5add6cdac5cb-catalog-content\") pod \"community-operators-wcdzr\" (UID: \"4f8205a1-38e0-47e3-be8f-5add6cdac5cb\") " pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.209824 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w9d9\" (UniqueName: 
\"kubernetes.io/projected/4f8205a1-38e0-47e3-be8f-5add6cdac5cb-kube-api-access-9w9d9\") pod \"community-operators-wcdzr\" (UID: \"4f8205a1-38e0-47e3-be8f-5add6cdac5cb\") " pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.286661 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.295811 4802 generic.go:334] "Generic (PLEG): container finished" podID="a3b32805-cd2f-493c-a2be-2d993678cd06" containerID="0e4caad5ab02172ec962788f9ca524f68c5f9739b7f38ea52057d1bd5e8536c6" exitCode=0 Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.295880 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q86qm" event={"ID":"a3b32805-cd2f-493c-a2be-2d993678cd06","Type":"ContainerDied","Data":"0e4caad5ab02172ec962788f9ca524f68c5f9739b7f38ea52057d1bd5e8536c6"} Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.295920 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q86qm" event={"ID":"a3b32805-cd2f-493c-a2be-2d993678cd06","Type":"ContainerStarted","Data":"63b32829da62203d827040636afa5a6daa5d48d61dd20f7d974f8348f570c1a5"} Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.675239 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wcdzr"] Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.974542 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fgxlr"] Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.993022 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgxlr"] Dec 01 20:02:08 crc kubenswrapper[4802]: I1201 20:02:08.993289 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.000735 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.100000 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdt2p\" (UniqueName: \"kubernetes.io/projected/038eb692-9117-4953-94a0-420941ce4b7a-kube-api-access-rdt2p\") pod \"redhat-marketplace-fgxlr\" (UID: \"038eb692-9117-4953-94a0-420941ce4b7a\") " pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.100053 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038eb692-9117-4953-94a0-420941ce4b7a-utilities\") pod \"redhat-marketplace-fgxlr\" (UID: \"038eb692-9117-4953-94a0-420941ce4b7a\") " pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.100108 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038eb692-9117-4953-94a0-420941ce4b7a-catalog-content\") pod \"redhat-marketplace-fgxlr\" (UID: \"038eb692-9117-4953-94a0-420941ce4b7a\") " pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.201784 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038eb692-9117-4953-94a0-420941ce4b7a-catalog-content\") pod \"redhat-marketplace-fgxlr\" (UID: \"038eb692-9117-4953-94a0-420941ce4b7a\") " pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.201929 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rdt2p\" (UniqueName: \"kubernetes.io/projected/038eb692-9117-4953-94a0-420941ce4b7a-kube-api-access-rdt2p\") pod \"redhat-marketplace-fgxlr\" (UID: \"038eb692-9117-4953-94a0-420941ce4b7a\") " pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.201966 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038eb692-9117-4953-94a0-420941ce4b7a-utilities\") pod \"redhat-marketplace-fgxlr\" (UID: \"038eb692-9117-4953-94a0-420941ce4b7a\") " pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.202570 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038eb692-9117-4953-94a0-420941ce4b7a-utilities\") pod \"redhat-marketplace-fgxlr\" (UID: \"038eb692-9117-4953-94a0-420941ce4b7a\") " pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.202592 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038eb692-9117-4953-94a0-420941ce4b7a-catalog-content\") pod \"redhat-marketplace-fgxlr\" (UID: \"038eb692-9117-4953-94a0-420941ce4b7a\") " pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.222484 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdt2p\" (UniqueName: \"kubernetes.io/projected/038eb692-9117-4953-94a0-420941ce4b7a-kube-api-access-rdt2p\") pod \"redhat-marketplace-fgxlr\" (UID: \"038eb692-9117-4953-94a0-420941ce4b7a\") " pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.302721 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-hhc44" event={"ID":"3118d07d-4c45-4759-b2a0-792e5f4ca0fc","Type":"ContainerStarted","Data":"42c47d01f0c8b8c0b8bcc8013b5a1a9f9563ca0f58a532b597c21b82d405c7c3"} Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.304587 4802 generic.go:334] "Generic (PLEG): container finished" podID="4f8205a1-38e0-47e3-be8f-5add6cdac5cb" containerID="350bc5bd0c48c02a28eec9a04f164629646c1ba130346ad47c23985f8d096260" exitCode=0 Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.304636 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdzr" event={"ID":"4f8205a1-38e0-47e3-be8f-5add6cdac5cb","Type":"ContainerDied","Data":"350bc5bd0c48c02a28eec9a04f164629646c1ba130346ad47c23985f8d096260"} Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.304654 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdzr" event={"ID":"4f8205a1-38e0-47e3-be8f-5add6cdac5cb","Type":"ContainerStarted","Data":"530c45107bb47ff089f093e23e729a374762ae13f9372ee4f2130f0a1e9b8bea"} Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.307612 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q86qm" event={"ID":"a3b32805-cd2f-493c-a2be-2d993678cd06","Type":"ContainerStarted","Data":"adabf5aa8a110bf551d0d613069c9aaee0235df510575bda076e218c6aafe1f6"} Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.316389 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:09 crc kubenswrapper[4802]: I1201 20:02:09.704300 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgxlr"] Dec 01 20:02:10 crc kubenswrapper[4802]: I1201 20:02:10.315499 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdzr" event={"ID":"4f8205a1-38e0-47e3-be8f-5add6cdac5cb","Type":"ContainerStarted","Data":"fbca529217f08b65ce84741a22ec0c94249cbda914bf66f1c1139da203c0d54f"} Dec 01 20:02:10 crc kubenswrapper[4802]: I1201 20:02:10.318769 4802 generic.go:334] "Generic (PLEG): container finished" podID="038eb692-9117-4953-94a0-420941ce4b7a" containerID="c1ceb4cb782b714bd8ac449431107422aed685567dc1ed6d86435dc375e6b0e9" exitCode=0 Dec 01 20:02:10 crc kubenswrapper[4802]: I1201 20:02:10.318841 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxlr" event={"ID":"038eb692-9117-4953-94a0-420941ce4b7a","Type":"ContainerDied","Data":"c1ceb4cb782b714bd8ac449431107422aed685567dc1ed6d86435dc375e6b0e9"} Dec 01 20:02:10 crc kubenswrapper[4802]: I1201 20:02:10.318866 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxlr" event={"ID":"038eb692-9117-4953-94a0-420941ce4b7a","Type":"ContainerStarted","Data":"c153c950305a12b9e1bb3cd9d9957b4018dd6b6303400aae20f1a52cb45d41c9"} Dec 01 20:02:10 crc kubenswrapper[4802]: I1201 20:02:10.320569 4802 generic.go:334] "Generic (PLEG): container finished" podID="a3b32805-cd2f-493c-a2be-2d993678cd06" containerID="adabf5aa8a110bf551d0d613069c9aaee0235df510575bda076e218c6aafe1f6" exitCode=0 Dec 01 20:02:10 crc kubenswrapper[4802]: I1201 20:02:10.320620 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q86qm" 
event={"ID":"a3b32805-cd2f-493c-a2be-2d993678cd06","Type":"ContainerDied","Data":"adabf5aa8a110bf551d0d613069c9aaee0235df510575bda076e218c6aafe1f6"} Dec 01 20:02:10 crc kubenswrapper[4802]: I1201 20:02:10.323619 4802 generic.go:334] "Generic (PLEG): container finished" podID="3118d07d-4c45-4759-b2a0-792e5f4ca0fc" containerID="42c47d01f0c8b8c0b8bcc8013b5a1a9f9563ca0f58a532b597c21b82d405c7c3" exitCode=0 Dec 01 20:02:10 crc kubenswrapper[4802]: I1201 20:02:10.323647 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhc44" event={"ID":"3118d07d-4c45-4759-b2a0-792e5f4ca0fc","Type":"ContainerDied","Data":"42c47d01f0c8b8c0b8bcc8013b5a1a9f9563ca0f58a532b597c21b82d405c7c3"} Dec 01 20:02:11 crc kubenswrapper[4802]: I1201 20:02:11.331419 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q86qm" event={"ID":"a3b32805-cd2f-493c-a2be-2d993678cd06","Type":"ContainerStarted","Data":"b57c623c5bcac6ff249fdb3e80f0c46e34950a6f02a0daa184c8a2499c0ccbbb"} Dec 01 20:02:11 crc kubenswrapper[4802]: I1201 20:02:11.334053 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhc44" event={"ID":"3118d07d-4c45-4759-b2a0-792e5f4ca0fc","Type":"ContainerStarted","Data":"569c641ea120c70e47ca199f289f5b095c6650466267fa7a77b805470f4945ee"} Dec 01 20:02:11 crc kubenswrapper[4802]: I1201 20:02:11.335869 4802 generic.go:334] "Generic (PLEG): container finished" podID="4f8205a1-38e0-47e3-be8f-5add6cdac5cb" containerID="fbca529217f08b65ce84741a22ec0c94249cbda914bf66f1c1139da203c0d54f" exitCode=0 Dec 01 20:02:11 crc kubenswrapper[4802]: I1201 20:02:11.335960 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdzr" event={"ID":"4f8205a1-38e0-47e3-be8f-5add6cdac5cb","Type":"ContainerDied","Data":"fbca529217f08b65ce84741a22ec0c94249cbda914bf66f1c1139da203c0d54f"} Dec 01 20:02:11 crc kubenswrapper[4802]: I1201 
20:02:11.337559 4802 generic.go:334] "Generic (PLEG): container finished" podID="038eb692-9117-4953-94a0-420941ce4b7a" containerID="e65b76407a21dc1e91fd165ac699c7db7c6b388ce4f271b7111a9ec0739771f7" exitCode=0 Dec 01 20:02:11 crc kubenswrapper[4802]: I1201 20:02:11.337592 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxlr" event={"ID":"038eb692-9117-4953-94a0-420941ce4b7a","Type":"ContainerDied","Data":"e65b76407a21dc1e91fd165ac699c7db7c6b388ce4f271b7111a9ec0739771f7"} Dec 01 20:02:11 crc kubenswrapper[4802]: I1201 20:02:11.350251 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q86qm" podStartSLOduration=2.792241941 podStartE2EDuration="5.350235923s" podCreationTimestamp="2025-12-01 20:02:06 +0000 UTC" firstStartedPulling="2025-12-01 20:02:08.299346771 +0000 UTC m=+349.861906452" lastFinishedPulling="2025-12-01 20:02:10.857340793 +0000 UTC m=+352.419900434" observedRunningTime="2025-12-01 20:02:11.349009383 +0000 UTC m=+352.911569024" watchObservedRunningTime="2025-12-01 20:02:11.350235923 +0000 UTC m=+352.912795564" Dec 01 20:02:11 crc kubenswrapper[4802]: I1201 20:02:11.411735 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hhc44" podStartSLOduration=2.569650063 podStartE2EDuration="6.411713327s" podCreationTimestamp="2025-12-01 20:02:05 +0000 UTC" firstStartedPulling="2025-12-01 20:02:07.292216453 +0000 UTC m=+348.854776094" lastFinishedPulling="2025-12-01 20:02:11.134279717 +0000 UTC m=+352.696839358" observedRunningTime="2025-12-01 20:02:11.407129582 +0000 UTC m=+352.969689233" watchObservedRunningTime="2025-12-01 20:02:11.411713327 +0000 UTC m=+352.974272978" Dec 01 20:02:13 crc kubenswrapper[4802]: I1201 20:02:13.349701 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wcdzr" 
event={"ID":"4f8205a1-38e0-47e3-be8f-5add6cdac5cb","Type":"ContainerStarted","Data":"48ccacb34e8d5384a12401adf921cddf4a11e2f4e020f100c1e9b9120fbd57f2"} Dec 01 20:02:13 crc kubenswrapper[4802]: I1201 20:02:13.351789 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxlr" event={"ID":"038eb692-9117-4953-94a0-420941ce4b7a","Type":"ContainerStarted","Data":"5ce19a4a7d5f347c24ce02e15752977f8e7fe264a82eed1e7300da59c81539a4"} Dec 01 20:02:13 crc kubenswrapper[4802]: I1201 20:02:13.397995 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wcdzr" podStartSLOduration=3.9869993360000002 podStartE2EDuration="6.397976254s" podCreationTimestamp="2025-12-01 20:02:07 +0000 UTC" firstStartedPulling="2025-12-01 20:02:09.307184722 +0000 UTC m=+350.869744363" lastFinishedPulling="2025-12-01 20:02:11.71816164 +0000 UTC m=+353.280721281" observedRunningTime="2025-12-01 20:02:13.374863079 +0000 UTC m=+354.937422710" watchObservedRunningTime="2025-12-01 20:02:13.397976254 +0000 UTC m=+354.960535885" Dec 01 20:02:13 crc kubenswrapper[4802]: I1201 20:02:13.398115 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fgxlr" podStartSLOduration=3.04817742 podStartE2EDuration="5.398111278s" podCreationTimestamp="2025-12-01 20:02:08 +0000 UTC" firstStartedPulling="2025-12-01 20:02:10.322640154 +0000 UTC m=+351.885199795" lastFinishedPulling="2025-12-01 20:02:12.672574002 +0000 UTC m=+354.235133653" observedRunningTime="2025-12-01 20:02:13.396032121 +0000 UTC m=+354.958591762" watchObservedRunningTime="2025-12-01 20:02:13.398111278 +0000 UTC m=+354.960670919" Dec 01 20:02:15 crc kubenswrapper[4802]: I1201 20:02:15.895779 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:15 crc kubenswrapper[4802]: I1201 20:02:15.896100 4802 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:16 crc kubenswrapper[4802]: I1201 20:02:16.901567 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:16 crc kubenswrapper[4802]: I1201 20:02:16.901913 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:16 crc kubenswrapper[4802]: I1201 20:02:16.935677 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:16 crc kubenswrapper[4802]: I1201 20:02:16.937340 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hhc44" podUID="3118d07d-4c45-4759-b2a0-792e5f4ca0fc" containerName="registry-server" probeResult="failure" output=< Dec 01 20:02:16 crc kubenswrapper[4802]: timeout: failed to connect service ":50051" within 1s Dec 01 20:02:16 crc kubenswrapper[4802]: > Dec 01 20:02:17 crc kubenswrapper[4802]: I1201 20:02:17.419233 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q86qm" Dec 01 20:02:18 crc kubenswrapper[4802]: I1201 20:02:18.286894 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:18 crc kubenswrapper[4802]: I1201 20:02:18.286967 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:18 crc kubenswrapper[4802]: I1201 20:02:18.326757 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:18 crc kubenswrapper[4802]: I1201 20:02:18.414978 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-wcdzr" Dec 01 20:02:19 crc kubenswrapper[4802]: I1201 20:02:19.317380 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:19 crc kubenswrapper[4802]: I1201 20:02:19.318394 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:19 crc kubenswrapper[4802]: I1201 20:02:19.352297 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:19 crc kubenswrapper[4802]: I1201 20:02:19.415041 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fgxlr" Dec 01 20:02:25 crc kubenswrapper[4802]: I1201 20:02:25.952459 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:25 crc kubenswrapper[4802]: I1201 20:02:25.987623 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hhc44" Dec 01 20:02:26 crc kubenswrapper[4802]: I1201 20:02:26.748252 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" podUID="e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" containerName="registry" containerID="cri-o://934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09" gracePeriod=30 Dec 01 20:02:28 crc kubenswrapper[4802]: I1201 20:02:28.088663 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:02:28 crc kubenswrapper[4802]: I1201 20:02:28.088765 4802 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:02:29 crc kubenswrapper[4802]: I1201 20:02:29.925708 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.077652 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-installation-pull-secrets\") pod \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.077743 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2rzw\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-kube-api-access-r2rzw\") pod \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.078012 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.078070 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-tls\") pod \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " Dec 01 20:02:30 
crc kubenswrapper[4802]: I1201 20:02:30.078125 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-bound-sa-token\") pod \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.078162 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-ca-trust-extracted\") pod \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.078260 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-trusted-ca\") pod \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.078300 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-certificates\") pod \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\" (UID: \"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3\") " Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.079451 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.079746 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.084159 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.088603 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-kube-api-access-r2rzw" (OuterVolumeSpecName: "kube-api-access-r2rzw") pod "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3"). InnerVolumeSpecName "kube-api-access-r2rzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.089847 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.090016 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.095249 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.098498 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" (UID: "e59b4a8c-e6d3-4b2b-900f-0098c1f863f3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.179358 4802 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.179399 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2rzw\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-kube-api-access-r2rzw\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.179417 4802 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.179433 4802 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.179447 4802 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.179458 4802 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.179469 4802 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:30 crc 
kubenswrapper[4802]: I1201 20:02:30.444056 4802 generic.go:334] "Generic (PLEG): container finished" podID="e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" containerID="934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09" exitCode=0 Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.444104 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" event={"ID":"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3","Type":"ContainerDied","Data":"934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09"} Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.444136 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" event={"ID":"e59b4a8c-e6d3-4b2b-900f-0098c1f863f3","Type":"ContainerDied","Data":"7d01374a6bed982cef7a09f5c25dcc57dfb8cbd9e86414dbda92b02a1219e31a"} Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.444156 4802 scope.go:117] "RemoveContainer" containerID="934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.444174 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zqg5r" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.458052 4802 scope.go:117] "RemoveContainer" containerID="934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09" Dec 01 20:02:30 crc kubenswrapper[4802]: E1201 20:02:30.458452 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09\": container with ID starting with 934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09 not found: ID does not exist" containerID="934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.458502 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09"} err="failed to get container status \"934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09\": rpc error: code = NotFound desc = could not find container \"934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09\": container with ID starting with 934721271d36aeebf7199d20721d8d0eb114e8be073cb854d67067c6b5b12a09 not found: ID does not exist" Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.475976 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zqg5r"] Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.478842 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zqg5r"] Dec 01 20:02:30 crc kubenswrapper[4802]: I1201 20:02:30.727900 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" path="/var/lib/kubelet/pods/e59b4a8c-e6d3-4b2b-900f-0098c1f863f3/volumes" Dec 01 20:02:33 crc kubenswrapper[4802]: I1201 
20:02:33.640886 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"] Dec 01 20:02:33 crc kubenswrapper[4802]: I1201 20:02:33.641481 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj" podUID="10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" containerName="route-controller-manager" containerID="cri-o://8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223" gracePeriod=30 Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.028364 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.127449 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-config\") pod \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.127758 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-serving-cert\") pod \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.127821 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-client-ca\") pod \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.127890 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz6zj\" 
(UniqueName: \"kubernetes.io/projected/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-kube-api-access-qz6zj\") pod \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\" (UID: \"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f\") " Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.128909 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-client-ca" (OuterVolumeSpecName: "client-ca") pod "10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" (UID: "10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.129050 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-config" (OuterVolumeSpecName: "config") pod "10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" (UID: "10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.133108 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" (UID: "10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.133284 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-kube-api-access-qz6zj" (OuterVolumeSpecName: "kube-api-access-qz6zj") pod "10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" (UID: "10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f"). InnerVolumeSpecName "kube-api-access-qz6zj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.229226 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz6zj\" (UniqueName: \"kubernetes.io/projected/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-kube-api-access-qz6zj\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.229269 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.229283 4802 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.229295 4802 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.469672 4802 generic.go:334] "Generic (PLEG): container finished" podID="10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" containerID="8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223" exitCode=0 Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.469765 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.469753 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj" event={"ID":"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f","Type":"ContainerDied","Data":"8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223"} Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.470389 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj" event={"ID":"10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f","Type":"ContainerDied","Data":"62cb104b1f74197214886f580de6be82d276560c062840c6cf347423d43acb89"} Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.470525 4802 scope.go:117] "RemoveContainer" containerID="8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.500288 4802 scope.go:117] "RemoveContainer" containerID="8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223" Dec 01 20:02:34 crc kubenswrapper[4802]: E1201 20:02:34.500864 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223\": container with ID starting with 8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223 not found: ID does not exist" containerID="8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.500922 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223"} err="failed to get container status \"8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223\": rpc error: code = NotFound desc 
= could not find container \"8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223\": container with ID starting with 8fa9bab85da1b5a8215a63b18643df367d0754ce1c64a2ee5c59240b01051223 not found: ID does not exist" Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.508501 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"] Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.515230 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-559846b6c5-9kxqj"] Dec 01 20:02:34 crc kubenswrapper[4802]: I1201 20:02:34.728258 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" path="/var/lib/kubelet/pods/10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f/volumes" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.239711 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx"] Dec 01 20:02:35 crc kubenswrapper[4802]: E1201 20:02:35.240027 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" containerName="route-controller-manager" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.240048 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" containerName="route-controller-manager" Dec 01 20:02:35 crc kubenswrapper[4802]: E1201 20:02:35.240068 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" containerName="registry" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.240080 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" containerName="registry" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.240299 4802 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e59b4a8c-e6d3-4b2b-900f-0098c1f863f3" containerName="registry" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.240325 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ff1dc7-b9d3-4fe6-a5be-a61bb8df887f" containerName="route-controller-manager" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.241046 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.243261 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.244968 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.244991 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.245250 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.245297 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.245377 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.253606 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx"] Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.343744 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4150a3cd-e95e-444c-a301-6fc3016c0550-client-ca\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.343817 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4150a3cd-e95e-444c-a301-6fc3016c0550-serving-cert\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.343906 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5zg9\" (UniqueName: \"kubernetes.io/projected/4150a3cd-e95e-444c-a301-6fc3016c0550-kube-api-access-k5zg9\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.343994 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4150a3cd-e95e-444c-a301-6fc3016c0550-config\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.445696 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4150a3cd-e95e-444c-a301-6fc3016c0550-config\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: 
\"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.445842 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4150a3cd-e95e-444c-a301-6fc3016c0550-client-ca\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.445914 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4150a3cd-e95e-444c-a301-6fc3016c0550-serving-cert\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.446007 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5zg9\" (UniqueName: \"kubernetes.io/projected/4150a3cd-e95e-444c-a301-6fc3016c0550-kube-api-access-k5zg9\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.447633 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4150a3cd-e95e-444c-a301-6fc3016c0550-client-ca\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.448592 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/4150a3cd-e95e-444c-a301-6fc3016c0550-config\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.453295 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4150a3cd-e95e-444c-a301-6fc3016c0550-serving-cert\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.480764 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5zg9\" (UniqueName: \"kubernetes.io/projected/4150a3cd-e95e-444c-a301-6fc3016c0550-kube-api-access-k5zg9\") pod \"route-controller-manager-776ddd45f7-n4kpx\" (UID: \"4150a3cd-e95e-444c-a301-6fc3016c0550\") " pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.567578 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:35 crc kubenswrapper[4802]: I1201 20:02:35.995893 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx"] Dec 01 20:02:36 crc kubenswrapper[4802]: I1201 20:02:36.489033 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" event={"ID":"4150a3cd-e95e-444c-a301-6fc3016c0550","Type":"ContainerStarted","Data":"0efc168bbccf7c5e3a4223089d1de4520031f897b868f4c04532f8182258cc6d"} Dec 01 20:02:36 crc kubenswrapper[4802]: I1201 20:02:36.489075 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" event={"ID":"4150a3cd-e95e-444c-a301-6fc3016c0550","Type":"ContainerStarted","Data":"d26409ba81d64e5ac56293f3b4e9fe9300e8fad88af80d26204c153c034fd587"} Dec 01 20:02:36 crc kubenswrapper[4802]: I1201 20:02:36.489825 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:36 crc kubenswrapper[4802]: I1201 20:02:36.510219 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" podStartSLOduration=3.510186455 podStartE2EDuration="3.510186455s" podCreationTimestamp="2025-12-01 20:02:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:02:36.508637587 +0000 UTC m=+378.071197308" watchObservedRunningTime="2025-12-01 20:02:36.510186455 +0000 UTC m=+378.072746096" Dec 01 20:02:36 crc kubenswrapper[4802]: I1201 20:02:36.966815 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-776ddd45f7-n4kpx" Dec 01 20:02:58 crc kubenswrapper[4802]: I1201 20:02:58.089629 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:02:58 crc kubenswrapper[4802]: I1201 20:02:58.091358 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:03:28 crc kubenswrapper[4802]: I1201 20:03:28.088934 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:03:28 crc kubenswrapper[4802]: I1201 20:03:28.089870 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:03:28 crc kubenswrapper[4802]: I1201 20:03:28.089982 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:03:28 crc kubenswrapper[4802]: I1201 20:03:28.090844 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"41318f41b1d38715fc9ba5256975e0f7c998f38cecb3c870cf77cc1261f579e2"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:03:28 crc kubenswrapper[4802]: I1201 20:03:28.090953 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://41318f41b1d38715fc9ba5256975e0f7c998f38cecb3c870cf77cc1261f579e2" gracePeriod=600 Dec 01 20:03:28 crc kubenswrapper[4802]: I1201 20:03:28.819903 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="41318f41b1d38715fc9ba5256975e0f7c998f38cecb3c870cf77cc1261f579e2" exitCode=0 Dec 01 20:03:28 crc kubenswrapper[4802]: I1201 20:03:28.819995 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"41318f41b1d38715fc9ba5256975e0f7c998f38cecb3c870cf77cc1261f579e2"} Dec 01 20:03:28 crc kubenswrapper[4802]: I1201 20:03:28.820377 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"de9d3c97b8a2a0443baba6eb01858a0cfeece219a2df500768ee3b289f9dd2d5"} Dec 01 20:03:28 crc kubenswrapper[4802]: I1201 20:03:28.820408 4802 scope.go:117] "RemoveContainer" containerID="3559474262a4b595e645b7eba2d1009977f6ae2459320bfefe2a5bbff48f1809" Dec 01 20:05:28 crc kubenswrapper[4802]: I1201 20:05:28.089483 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:05:28 crc kubenswrapper[4802]: I1201 20:05:28.091427 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:05:58 crc kubenswrapper[4802]: I1201 20:05:58.088616 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:05:58 crc kubenswrapper[4802]: I1201 20:05:58.089769 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:06:28 crc kubenswrapper[4802]: I1201 20:06:28.089306 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:06:28 crc kubenswrapper[4802]: I1201 20:06:28.090323 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 01 20:06:28 crc kubenswrapper[4802]: I1201 20:06:28.090399 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:06:28 crc kubenswrapper[4802]: I1201 20:06:28.091052 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de9d3c97b8a2a0443baba6eb01858a0cfeece219a2df500768ee3b289f9dd2d5"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:06:28 crc kubenswrapper[4802]: I1201 20:06:28.091134 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://de9d3c97b8a2a0443baba6eb01858a0cfeece219a2df500768ee3b289f9dd2d5" gracePeriod=600 Dec 01 20:06:29 crc kubenswrapper[4802]: I1201 20:06:29.006006 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="de9d3c97b8a2a0443baba6eb01858a0cfeece219a2df500768ee3b289f9dd2d5" exitCode=0 Dec 01 20:06:29 crc kubenswrapper[4802]: I1201 20:06:29.006354 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"de9d3c97b8a2a0443baba6eb01858a0cfeece219a2df500768ee3b289f9dd2d5"} Dec 01 20:06:29 crc kubenswrapper[4802]: I1201 20:06:29.007398 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"7b877900cbea92263f9945a8b0d73242a0986bd9839c145102aaab83d242fee9"} Dec 01 20:06:29 
crc kubenswrapper[4802]: I1201 20:06:29.007499 4802 scope.go:117] "RemoveContainer" containerID="41318f41b1d38715fc9ba5256975e0f7c998f38cecb3c870cf77cc1261f579e2" Dec 01 20:08:28 crc kubenswrapper[4802]: I1201 20:08:28.088150 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:08:28 crc kubenswrapper[4802]: I1201 20:08:28.088695 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:08:58 crc kubenswrapper[4802]: I1201 20:08:58.088848 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:08:58 crc kubenswrapper[4802]: I1201 20:08:58.089495 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:09:09 crc kubenswrapper[4802]: I1201 20:09:09.120589 4802 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 20:09:28 crc kubenswrapper[4802]: I1201 20:09:28.089672 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:09:28 crc kubenswrapper[4802]: I1201 20:09:28.090452 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:09:28 crc kubenswrapper[4802]: I1201 20:09:28.090526 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:09:28 crc kubenswrapper[4802]: I1201 20:09:28.091497 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b877900cbea92263f9945a8b0d73242a0986bd9839c145102aaab83d242fee9"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:09:28 crc kubenswrapper[4802]: I1201 20:09:28.091617 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://7b877900cbea92263f9945a8b0d73242a0986bd9839c145102aaab83d242fee9" gracePeriod=600 Dec 01 20:09:29 crc kubenswrapper[4802]: I1201 20:09:29.108017 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="7b877900cbea92263f9945a8b0d73242a0986bd9839c145102aaab83d242fee9" exitCode=0 Dec 01 20:09:29 crc kubenswrapper[4802]: I1201 20:09:29.108103 4802 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"7b877900cbea92263f9945a8b0d73242a0986bd9839c145102aaab83d242fee9"} Dec 01 20:09:29 crc kubenswrapper[4802]: I1201 20:09:29.108825 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"b5dce8f18ad191d77889a5744c4581ae578cf1b86f80207070351f56ae6cf862"} Dec 01 20:09:29 crc kubenswrapper[4802]: I1201 20:09:29.108858 4802 scope.go:117] "RemoveContainer" containerID="de9d3c97b8a2a0443baba6eb01858a0cfeece219a2df500768ee3b289f9dd2d5" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.040656 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-r5lnb"] Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.042532 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-r5lnb" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.049260 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.049475 4802 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zgtw9" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.049608 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.051511 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7zjnt"] Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.052466 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7zjnt" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.053892 4802 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-fxvkv" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.066082 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-r5lnb"] Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.069443 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7zjnt"] Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.072342 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-p86w8"] Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.073681 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.079443 4802 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-4gwzh" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.080750 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-p86w8"] Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.174299 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plhll\" (UniqueName: \"kubernetes.io/projected/b8339c52-f023-4f4c-9cf2-948f94a27e7a-kube-api-access-plhll\") pod \"cert-manager-webhook-5655c58dd6-p86w8\" (UID: \"b8339c52-f023-4f4c-9cf2-948f94a27e7a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.174395 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m6l9\" (UniqueName: 
\"kubernetes.io/projected/2608cc8e-13d1-43b6-b033-1b62df0333fb-kube-api-access-9m6l9\") pod \"cert-manager-cainjector-7f985d654d-r5lnb\" (UID: \"2608cc8e-13d1-43b6-b033-1b62df0333fb\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-r5lnb" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.174433 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf9xc\" (UniqueName: \"kubernetes.io/projected/07ba6850-9e9e-42d2-bd61-dc97bc185119-kube-api-access-hf9xc\") pod \"cert-manager-5b446d88c5-7zjnt\" (UID: \"07ba6850-9e9e-42d2-bd61-dc97bc185119\") " pod="cert-manager/cert-manager-5b446d88c5-7zjnt" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.276177 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m6l9\" (UniqueName: \"kubernetes.io/projected/2608cc8e-13d1-43b6-b033-1b62df0333fb-kube-api-access-9m6l9\") pod \"cert-manager-cainjector-7f985d654d-r5lnb\" (UID: \"2608cc8e-13d1-43b6-b033-1b62df0333fb\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-r5lnb" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.276262 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf9xc\" (UniqueName: \"kubernetes.io/projected/07ba6850-9e9e-42d2-bd61-dc97bc185119-kube-api-access-hf9xc\") pod \"cert-manager-5b446d88c5-7zjnt\" (UID: \"07ba6850-9e9e-42d2-bd61-dc97bc185119\") " pod="cert-manager/cert-manager-5b446d88c5-7zjnt" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.276309 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plhll\" (UniqueName: \"kubernetes.io/projected/b8339c52-f023-4f4c-9cf2-948f94a27e7a-kube-api-access-plhll\") pod \"cert-manager-webhook-5655c58dd6-p86w8\" (UID: \"b8339c52-f023-4f4c-9cf2-948f94a27e7a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.298236 
4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf9xc\" (UniqueName: \"kubernetes.io/projected/07ba6850-9e9e-42d2-bd61-dc97bc185119-kube-api-access-hf9xc\") pod \"cert-manager-5b446d88c5-7zjnt\" (UID: \"07ba6850-9e9e-42d2-bd61-dc97bc185119\") " pod="cert-manager/cert-manager-5b446d88c5-7zjnt" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.298904 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plhll\" (UniqueName: \"kubernetes.io/projected/b8339c52-f023-4f4c-9cf2-948f94a27e7a-kube-api-access-plhll\") pod \"cert-manager-webhook-5655c58dd6-p86w8\" (UID: \"b8339c52-f023-4f4c-9cf2-948f94a27e7a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.300309 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m6l9\" (UniqueName: \"kubernetes.io/projected/2608cc8e-13d1-43b6-b033-1b62df0333fb-kube-api-access-9m6l9\") pod \"cert-manager-cainjector-7f985d654d-r5lnb\" (UID: \"2608cc8e-13d1-43b6-b033-1b62df0333fb\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-r5lnb" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.369347 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-r5lnb" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.381484 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-7zjnt" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.391725 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.572558 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-r5lnb"] Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.585459 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.830411 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-7zjnt"] Dec 01 20:09:44 crc kubenswrapper[4802]: W1201 20:09:44.833979 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ba6850_9e9e_42d2_bd61_dc97bc185119.slice/crio-a86129521f9448495a8dba2f4de693041de78be4a973fc6f61ef046781ad3ee7 WatchSource:0}: Error finding container a86129521f9448495a8dba2f4de693041de78be4a973fc6f61ef046781ad3ee7: Status 404 returned error can't find the container with id a86129521f9448495a8dba2f4de693041de78be4a973fc6f61ef046781ad3ee7 Dec 01 20:09:44 crc kubenswrapper[4802]: I1201 20:09:44.834264 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-p86w8"] Dec 01 20:09:44 crc kubenswrapper[4802]: W1201 20:09:44.839088 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8339c52_f023_4f4c_9cf2_948f94a27e7a.slice/crio-0ae4a34dbe41f984201d2556b3f7d7c3def4dfe3cb0ee48e140dcbe2d8c22c97 WatchSource:0}: Error finding container 0ae4a34dbe41f984201d2556b3f7d7c3def4dfe3cb0ee48e140dcbe2d8c22c97: Status 404 returned error can't find the container with id 0ae4a34dbe41f984201d2556b3f7d7c3def4dfe3cb0ee48e140dcbe2d8c22c97 Dec 01 20:09:45 crc kubenswrapper[4802]: I1201 20:09:45.202119 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" event={"ID":"b8339c52-f023-4f4c-9cf2-948f94a27e7a","Type":"ContainerStarted","Data":"0ae4a34dbe41f984201d2556b3f7d7c3def4dfe3cb0ee48e140dcbe2d8c22c97"} Dec 01 20:09:45 crc kubenswrapper[4802]: I1201 20:09:45.203106 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7zjnt" event={"ID":"07ba6850-9e9e-42d2-bd61-dc97bc185119","Type":"ContainerStarted","Data":"a86129521f9448495a8dba2f4de693041de78be4a973fc6f61ef046781ad3ee7"} Dec 01 20:09:45 crc kubenswrapper[4802]: I1201 20:09:45.204274 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-r5lnb" event={"ID":"2608cc8e-13d1-43b6-b033-1b62df0333fb","Type":"ContainerStarted","Data":"69779e3cbbab3aa6f88f7ca9f5942c1f7ce312e49e4e3143dab75b78a87b5584"} Dec 01 20:09:49 crc kubenswrapper[4802]: I1201 20:09:49.226114 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-r5lnb" event={"ID":"2608cc8e-13d1-43b6-b033-1b62df0333fb","Type":"ContainerStarted","Data":"a6579842f7e78c6201d59d07a1d2e0661fc5c4b5b607d9c2b36f830ef511c06e"} Dec 01 20:09:49 crc kubenswrapper[4802]: I1201 20:09:49.228309 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" event={"ID":"b8339c52-f023-4f4c-9cf2-948f94a27e7a","Type":"ContainerStarted","Data":"0e3fce6fa927db368665a75ac9a02597247f1c1603b27729227a69f8ca6b39b9"} Dec 01 20:09:49 crc kubenswrapper[4802]: I1201 20:09:49.228404 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" Dec 01 20:09:49 crc kubenswrapper[4802]: I1201 20:09:49.230284 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-7zjnt" 
event={"ID":"07ba6850-9e9e-42d2-bd61-dc97bc185119","Type":"ContainerStarted","Data":"1a56ed484ebbab80a9cd0e7a8d029720f48e6523dcfaa7c9a73e59bc7ce96342"} Dec 01 20:09:49 crc kubenswrapper[4802]: I1201 20:09:49.251277 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-r5lnb" podStartSLOduration=1.484285091 podStartE2EDuration="5.251258096s" podCreationTimestamp="2025-12-01 20:09:44 +0000 UTC" firstStartedPulling="2025-12-01 20:09:44.585166584 +0000 UTC m=+806.147726225" lastFinishedPulling="2025-12-01 20:09:48.352139569 +0000 UTC m=+809.914699230" observedRunningTime="2025-12-01 20:09:49.24655321 +0000 UTC m=+810.809112871" watchObservedRunningTime="2025-12-01 20:09:49.251258096 +0000 UTC m=+810.813817757" Dec 01 20:09:49 crc kubenswrapper[4802]: I1201 20:09:49.266104 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-7zjnt" podStartSLOduration=1.746701085 podStartE2EDuration="5.266080599s" podCreationTimestamp="2025-12-01 20:09:44 +0000 UTC" firstStartedPulling="2025-12-01 20:09:44.836608285 +0000 UTC m=+806.399167926" lastFinishedPulling="2025-12-01 20:09:48.355987789 +0000 UTC m=+809.918547440" observedRunningTime="2025-12-01 20:09:49.261034722 +0000 UTC m=+810.823594383" watchObservedRunningTime="2025-12-01 20:09:49.266080599 +0000 UTC m=+810.828640240" Dec 01 20:09:49 crc kubenswrapper[4802]: I1201 20:09:49.286537 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" podStartSLOduration=1.767257675 podStartE2EDuration="5.286501425s" podCreationTimestamp="2025-12-01 20:09:44 +0000 UTC" firstStartedPulling="2025-12-01 20:09:44.841252619 +0000 UTC m=+806.403812260" lastFinishedPulling="2025-12-01 20:09:48.360496359 +0000 UTC m=+809.923056010" observedRunningTime="2025-12-01 20:09:49.283162591 +0000 UTC m=+810.845722242" 
watchObservedRunningTime="2025-12-01 20:09:49.286501425 +0000 UTC m=+810.849061076" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.192680 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7nr2"] Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.193778 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovn-controller" containerID="cri-o://5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7" gracePeriod=30 Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.193858 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2" gracePeriod=30 Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.193850 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="nbdb" containerID="cri-o://f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f" gracePeriod=30 Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.193976 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovn-acl-logging" containerID="cri-o://3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3" gracePeriod=30 Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.194026 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kube-rbac-proxy-node" 
containerID="cri-o://ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6" gracePeriod=30 Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.194023 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="northd" containerID="cri-o://2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119" gracePeriod=30 Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.194145 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="sbdb" containerID="cri-o://55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2" gracePeriod=30 Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.247239 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" containerID="cri-o://f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb" gracePeriod=30 Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.395420 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-p86w8" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.558793 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/3.log" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.562057 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovn-acl-logging/0.log" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.562806 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovn-controller/0.log" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.563662 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617495 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mdpvz"] Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617701 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovn-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617713 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovn-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617722 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617727 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617735 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovn-acl-logging" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617741 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovn-acl-logging" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617750 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kubecfg-setup" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617757 4802 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kubecfg-setup" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617765 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="nbdb" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617770 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="nbdb" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617778 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kube-rbac-proxy-node" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617784 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kube-rbac-proxy-node" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617796 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="sbdb" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617802 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="sbdb" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617811 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617818 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617826 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617833 4802 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617843 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617849 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.617857 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="northd" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617863 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="northd" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617946 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617955 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617962 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="sbdb" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617971 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617982 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovn-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617990 4802 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="nbdb" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.617997 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="northd" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.618005 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovn-acl-logging" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.618013 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.618022 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="kube-rbac-proxy-node" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.618109 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.618115 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: E1201 20:09:54.618124 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.618129 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.618252 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.618265 
4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" containerName="ovnkube-controller" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.620208 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.620858 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-script-lib\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.620916 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-slash\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.620946 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-ovn-kubernetes\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.620973 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-var-lib-openvswitch\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.620992 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-env-overrides\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621021 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/933fb25a-a01a-464e-838a-df1d07bca99e-ovn-node-metrics-cert\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621051 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-ovn\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621073 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-netns\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621065 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621122 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621091 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-slash" (OuterVolumeSpecName: "host-slash") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621158 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621093 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-openvswitch\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621283 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9xgd\" (UniqueName: \"kubernetes.io/projected/933fb25a-a01a-464e-838a-df1d07bca99e-kube-api-access-d9xgd\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621177 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621343 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-netd\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621228 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621374 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621438 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621528 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-etc-openvswitch\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621618 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-log-socket\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621666 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-kubelet\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: 
\"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621715 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-systemd-units\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621745 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621760 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-node-log\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621786 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621802 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-log-socket" (OuterVolumeSpecName: "log-socket") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). 
InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621811 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-systemd\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621825 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621842 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-node-log" (OuterVolumeSpecName: "node-log") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621830 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621855 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621897 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-config\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621959 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.621996 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-bin\") pod \"933fb25a-a01a-464e-838a-df1d07bca99e\" (UID: \"933fb25a-a01a-464e-838a-df1d07bca99e\") " Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622089 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). 
InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622563 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622640 4802 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622671 4802 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622695 4802 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622714 4802 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622733 4802 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622753 4802 
reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622772 4802 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622789 4802 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622808 4802 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622825 4802 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622842 4802 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622859 4802 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622877 4802 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622895 4802 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622915 4802 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.622933 4802 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.628529 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933fb25a-a01a-464e-838a-df1d07bca99e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.628754 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/933fb25a-a01a-464e-838a-df1d07bca99e-kube-api-access-d9xgd" (OuterVolumeSpecName: "kube-api-access-d9xgd") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "kube-api-access-d9xgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.639822 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "933fb25a-a01a-464e-838a-df1d07bca99e" (UID: "933fb25a-a01a-464e-838a-df1d07bca99e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.723867 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l99n\" (UniqueName: \"kubernetes.io/projected/f9085207-962c-4399-a179-d5cbe58bb2a7-kube-api-access-9l99n\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.723920 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-cni-netd\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.723943 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-kubelet\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.723969 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724002 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-slash\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724126 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-etc-openvswitch\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724187 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-run-ovn\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724272 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-cni-bin\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724323 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f9085207-962c-4399-a179-d5cbe58bb2a7-ovn-node-metrics-cert\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724374 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f9085207-962c-4399-a179-d5cbe58bb2a7-ovnkube-config\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724487 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-node-log\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724532 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-systemd-units\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724571 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-run-netns\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724628 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724683 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-run-systemd\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724720 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-log-socket\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724751 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f9085207-962c-4399-a179-d5cbe58bb2a7-env-overrides\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724789 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-var-lib-openvswitch\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724827 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f9085207-962c-4399-a179-d5cbe58bb2a7-ovnkube-script-lib\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.724863 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-run-openvswitch\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.725167 4802 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/933fb25a-a01a-464e-838a-df1d07bca99e-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.725222 4802 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/933fb25a-a01a-464e-838a-df1d07bca99e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.725239 4802 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/933fb25a-a01a-464e-838a-df1d07bca99e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.725254 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9xgd\" (UniqueName: \"kubernetes.io/projected/933fb25a-a01a-464e-838a-df1d07bca99e-kube-api-access-d9xgd\") on node \"crc\" DevicePath \"\"" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826550 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-run-openvswitch\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826654 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l99n\" (UniqueName: \"kubernetes.io/projected/f9085207-962c-4399-a179-d5cbe58bb2a7-kube-api-access-9l99n\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826710 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-cni-netd\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826747 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826744 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-run-openvswitch\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826821 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-kubelet\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826873 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-cni-netd\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826772 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-kubelet\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826887 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826923 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-slash\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826955 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-etc-openvswitch\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.826976 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-run-ovn\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827002 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-cni-bin\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827029 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f9085207-962c-4399-a179-d5cbe58bb2a7-ovn-node-metrics-cert\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827036 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-slash\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827051 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f9085207-962c-4399-a179-d5cbe58bb2a7-ovnkube-config\") pod 
\"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827151 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-run-ovn\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827185 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-node-log\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827222 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-etc-openvswitch\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827259 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-node-log\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827268 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-cni-bin\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc 
kubenswrapper[4802]: I1201 20:09:54.827438 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-systemd-units\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827537 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-run-netns\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827611 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827705 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-run-systemd\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827778 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-log-socket\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827815 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f9085207-962c-4399-a179-d5cbe58bb2a7-env-overrides\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827866 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-var-lib-openvswitch\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.827909 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f9085207-962c-4399-a179-d5cbe58bb2a7-ovnkube-script-lib\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.828018 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.828089 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-var-lib-openvswitch\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.828024 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-run-systemd\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.828058 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-systemd-units\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.828060 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-host-run-netns\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.828360 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f9085207-962c-4399-a179-d5cbe58bb2a7-ovnkube-config\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.828494 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f9085207-962c-4399-a179-d5cbe58bb2a7-log-socket\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.828563 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f9085207-962c-4399-a179-d5cbe58bb2a7-env-overrides\") pod 
\"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.829575 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f9085207-962c-4399-a179-d5cbe58bb2a7-ovnkube-script-lib\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.833255 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f9085207-962c-4399-a179-d5cbe58bb2a7-ovn-node-metrics-cert\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.851389 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l99n\" (UniqueName: \"kubernetes.io/projected/f9085207-962c-4399-a179-d5cbe58bb2a7-kube-api-access-9l99n\") pod \"ovnkube-node-mdpvz\" (UID: \"f9085207-962c-4399-a179-d5cbe58bb2a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:54 crc kubenswrapper[4802]: I1201 20:09:54.938617 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.275769 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovnkube-controller/3.log" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.279436 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovn-acl-logging/0.log" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280180 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t7nr2_933fb25a-a01a-464e-838a-df1d07bca99e/ovn-controller/0.log" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280622 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb" exitCode=0 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280648 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2" exitCode=0 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280659 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f" exitCode=0 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280668 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119" exitCode=0 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280679 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" 
containerID="0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2" exitCode=0 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280685 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6" exitCode=0 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280692 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3" exitCode=143 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280700 4802 generic.go:334] "Generic (PLEG): container finished" podID="933fb25a-a01a-464e-838a-df1d07bca99e" containerID="5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7" exitCode=143 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280691 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280802 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280840 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280874 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" 
event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280900 4802 scope.go:117] "RemoveContainer" containerID="f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.280907 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281075 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281115 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281147 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281160 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281173 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} Dec 01 20:09:55 crc kubenswrapper[4802]: 
I1201 20:09:55.281185 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281353 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281375 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281390 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281400 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281404 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281832 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281870 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281898 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281914 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281932 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281950 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281967 4802 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.281983 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282002 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282018 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282035 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282059 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282125 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282145 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 
20:09:55.282162 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282177 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282224 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282242 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282258 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282275 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282292 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282307 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 
20:09:55.282331 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7nr2" event={"ID":"933fb25a-a01a-464e-838a-df1d07bca99e","Type":"ContainerDied","Data":"6638e6250b02a048e64f9109c1084cd29aa6dec4c71066d84b55be130ca8d575"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282359 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282382 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282398 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282414 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282433 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282449 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282465 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282483 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282499 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282515 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.282948 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/2.log" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.283599 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/1.log" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.283700 4802 generic.go:334] "Generic (PLEG): container finished" podID="bd82ca15-4489-4c15-aaf0-afb6b6787dc6" containerID="ffc9ffc722d4b9500e320b758594376eca2d1523039741ca000571e4d2a4865b" exitCode=2 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.283785 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8zl28" event={"ID":"bd82ca15-4489-4c15-aaf0-afb6b6787dc6","Type":"ContainerDied","Data":"ffc9ffc722d4b9500e320b758594376eca2d1523039741ca000571e4d2a4865b"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.283830 4802 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.285082 4802 scope.go:117] "RemoveContainer" containerID="ffc9ffc722d4b9500e320b758594376eca2d1523039741ca000571e4d2a4865b" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.285795 4802 generic.go:334] "Generic (PLEG): container finished" podID="f9085207-962c-4399-a179-d5cbe58bb2a7" containerID="b549c1836d351bd4cadb0b4010acba7b2c68c04aaf88319c3746992871b01477" exitCode=0 Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.285853 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerDied","Data":"b549c1836d351bd4cadb0b4010acba7b2c68c04aaf88319c3746992871b01477"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.285886 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerStarted","Data":"568b5ec1f3a7ed628bf9d54193d6b6b7ae2674783348aa8ecef4efee0d6aae5a"} Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.320270 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.392116 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7nr2"] Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.398374 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7nr2"] Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.403552 4802 scope.go:117] "RemoveContainer" containerID="55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.445286 4802 scope.go:117] "RemoveContainer" 
containerID="f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.464728 4802 scope.go:117] "RemoveContainer" containerID="2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.484609 4802 scope.go:117] "RemoveContainer" containerID="0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.503421 4802 scope.go:117] "RemoveContainer" containerID="ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.523696 4802 scope.go:117] "RemoveContainer" containerID="3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.551308 4802 scope.go:117] "RemoveContainer" containerID="5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.591141 4802 scope.go:117] "RemoveContainer" containerID="c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.605818 4802 scope.go:117] "RemoveContainer" containerID="f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.606444 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": container with ID starting with f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb not found: ID does not exist" containerID="f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.606506 4802 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} err="failed to get container status \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": rpc error: code = NotFound desc = could not find container \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": container with ID starting with f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.606547 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.607012 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\": container with ID starting with 5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d not found: ID does not exist" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.607052 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} err="failed to get container status \"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\": rpc error: code = NotFound desc = could not find container \"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\": container with ID starting with 5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.607080 4802 scope.go:117] "RemoveContainer" containerID="55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.607373 4802 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\": container with ID starting with 55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2 not found: ID does not exist" containerID="55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.607404 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} err="failed to get container status \"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\": rpc error: code = NotFound desc = could not find container \"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\": container with ID starting with 55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.607426 4802 scope.go:117] "RemoveContainer" containerID="f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.607727 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\": container with ID starting with f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f not found: ID does not exist" containerID="f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.607754 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} err="failed to get container status \"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\": rpc error: code = NotFound desc = could not find container 
\"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\": container with ID starting with f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.607769 4802 scope.go:117] "RemoveContainer" containerID="2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.608252 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\": container with ID starting with 2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119 not found: ID does not exist" containerID="2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.608298 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} err="failed to get container status \"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\": rpc error: code = NotFound desc = could not find container \"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\": container with ID starting with 2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.608328 4802 scope.go:117] "RemoveContainer" containerID="0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.608727 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\": container with ID starting with 0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2 not found: ID does not exist" 
containerID="0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.608752 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} err="failed to get container status \"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\": rpc error: code = NotFound desc = could not find container \"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\": container with ID starting with 0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.608772 4802 scope.go:117] "RemoveContainer" containerID="ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.609032 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\": container with ID starting with ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6 not found: ID does not exist" containerID="ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.609058 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} err="failed to get container status \"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\": rpc error: code = NotFound desc = could not find container \"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\": container with ID starting with ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.609079 4802 scope.go:117] 
"RemoveContainer" containerID="3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.609349 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\": container with ID starting with 3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3 not found: ID does not exist" containerID="3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.609380 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} err="failed to get container status \"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\": rpc error: code = NotFound desc = could not find container \"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\": container with ID starting with 3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.609407 4802 scope.go:117] "RemoveContainer" containerID="5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.609780 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\": container with ID starting with 5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7 not found: ID does not exist" containerID="5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.609818 4802 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} err="failed to get container status \"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\": rpc error: code = NotFound desc = could not find container \"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\": container with ID starting with 5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.609839 4802 scope.go:117] "RemoveContainer" containerID="c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a" Dec 01 20:09:55 crc kubenswrapper[4802]: E1201 20:09:55.610259 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\": container with ID starting with c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a not found: ID does not exist" containerID="c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.610289 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a"} err="failed to get container status \"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\": rpc error: code = NotFound desc = could not find container \"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\": container with ID starting with c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.610315 4802 scope.go:117] "RemoveContainer" containerID="f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.610683 4802 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} err="failed to get container status \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": rpc error: code = NotFound desc = could not find container \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": container with ID starting with f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.610703 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.611021 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} err="failed to get container status \"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\": rpc error: code = NotFound desc = could not find container \"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\": container with ID starting with 5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.611038 4802 scope.go:117] "RemoveContainer" containerID="55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.611313 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} err="failed to get container status \"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\": rpc error: code = NotFound desc = could not find container \"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\": container with ID starting with 55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2 not 
found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.611338 4802 scope.go:117] "RemoveContainer" containerID="f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.611598 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} err="failed to get container status \"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\": rpc error: code = NotFound desc = could not find container \"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\": container with ID starting with f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.611622 4802 scope.go:117] "RemoveContainer" containerID="2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.611864 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} err="failed to get container status \"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\": rpc error: code = NotFound desc = could not find container \"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\": container with ID starting with 2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.611885 4802 scope.go:117] "RemoveContainer" containerID="0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.612253 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} err="failed to get 
container status \"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\": rpc error: code = NotFound desc = could not find container \"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\": container with ID starting with 0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.612276 4802 scope.go:117] "RemoveContainer" containerID="ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.612572 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} err="failed to get container status \"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\": rpc error: code = NotFound desc = could not find container \"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\": container with ID starting with ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.612598 4802 scope.go:117] "RemoveContainer" containerID="3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.613047 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} err="failed to get container status \"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\": rpc error: code = NotFound desc = could not find container \"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\": container with ID starting with 3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.613073 4802 scope.go:117] "RemoveContainer" 
containerID="5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.613402 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} err="failed to get container status \"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\": rpc error: code = NotFound desc = could not find container \"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\": container with ID starting with 5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.613423 4802 scope.go:117] "RemoveContainer" containerID="c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.613711 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a"} err="failed to get container status \"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\": rpc error: code = NotFound desc = could not find container \"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\": container with ID starting with c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.613737 4802 scope.go:117] "RemoveContainer" containerID="f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.614020 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} err="failed to get container status \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": rpc error: code = NotFound desc = could 
not find container \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": container with ID starting with f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.614042 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.614453 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} err="failed to get container status \"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\": rpc error: code = NotFound desc = could not find container \"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\": container with ID starting with 5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.614473 4802 scope.go:117] "RemoveContainer" containerID="55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.614725 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} err="failed to get container status \"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\": rpc error: code = NotFound desc = could not find container \"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\": container with ID starting with 55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.614743 4802 scope.go:117] "RemoveContainer" containerID="f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 
20:09:55.615028 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} err="failed to get container status \"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\": rpc error: code = NotFound desc = could not find container \"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\": container with ID starting with f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.615049 4802 scope.go:117] "RemoveContainer" containerID="2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.615421 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} err="failed to get container status \"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\": rpc error: code = NotFound desc = could not find container \"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\": container with ID starting with 2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.615449 4802 scope.go:117] "RemoveContainer" containerID="0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.615959 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} err="failed to get container status \"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\": rpc error: code = NotFound desc = could not find container \"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\": container with ID starting with 
0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.615980 4802 scope.go:117] "RemoveContainer" containerID="ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.616418 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} err="failed to get container status \"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\": rpc error: code = NotFound desc = could not find container \"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\": container with ID starting with ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.616440 4802 scope.go:117] "RemoveContainer" containerID="3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.616751 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} err="failed to get container status \"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\": rpc error: code = NotFound desc = could not find container \"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\": container with ID starting with 3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.616775 4802 scope.go:117] "RemoveContainer" containerID="5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.617479 4802 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} err="failed to get container status \"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\": rpc error: code = NotFound desc = could not find container \"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\": container with ID starting with 5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.617506 4802 scope.go:117] "RemoveContainer" containerID="c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.619229 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a"} err="failed to get container status \"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\": rpc error: code = NotFound desc = could not find container \"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\": container with ID starting with c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.619264 4802 scope.go:117] "RemoveContainer" containerID="f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.619589 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} err="failed to get container status \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": rpc error: code = NotFound desc = could not find container \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": container with ID starting with f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb not found: ID does not 
exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.619608 4802 scope.go:117] "RemoveContainer" containerID="5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.621057 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d"} err="failed to get container status \"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\": rpc error: code = NotFound desc = could not find container \"5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d\": container with ID starting with 5dae40070bca5b57e65c561bd2a8e5bf3876f65dfa5a28f3951726af5deab04d not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.621085 4802 scope.go:117] "RemoveContainer" containerID="55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.621773 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2"} err="failed to get container status \"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\": rpc error: code = NotFound desc = could not find container \"55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2\": container with ID starting with 55e86a60efc48cbc83aa9b1a9e9da48cbfeae2e453da104a0668ac9286dedfe2 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.621796 4802 scope.go:117] "RemoveContainer" containerID="f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.621964 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f"} err="failed to get container status 
\"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\": rpc error: code = NotFound desc = could not find container \"f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f\": container with ID starting with f5b439cff13ad404008fc13d04a98237c38abd615852c082faff082c0536a03f not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.621987 4802 scope.go:117] "RemoveContainer" containerID="2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.622176 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119"} err="failed to get container status \"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\": rpc error: code = NotFound desc = could not find container \"2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119\": container with ID starting with 2db12492b14029504eef415e3a79a5260e4e42bd1ad1cd93673cefd79f1ad119 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.622211 4802 scope.go:117] "RemoveContainer" containerID="0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.622374 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2"} err="failed to get container status \"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\": rpc error: code = NotFound desc = could not find container \"0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2\": container with ID starting with 0276f46615a45adfff718e06cd1e92d741575d53b3aa00823c2c29e390109bb2 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.622425 4802 scope.go:117] "RemoveContainer" 
containerID="ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.622633 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6"} err="failed to get container status \"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\": rpc error: code = NotFound desc = could not find container \"ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6\": container with ID starting with ff599e9ea47431c6a7d2c5f82f5efc871d1aa0ee70e1423cbf153b957d8497f6 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.622665 4802 scope.go:117] "RemoveContainer" containerID="3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.622857 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3"} err="failed to get container status \"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\": rpc error: code = NotFound desc = could not find container \"3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3\": container with ID starting with 3d7e03a03d67c868db06ff90f62ce2b6230a659630bf9e3c7b2c86607f5133f3 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.622882 4802 scope.go:117] "RemoveContainer" containerID="5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.623054 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7"} err="failed to get container status \"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\": rpc error: code = NotFound desc = could 
not find container \"5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7\": container with ID starting with 5964c4bc6718290d1f9058323627241552a732d8d9efa4cafdc07b263fd8cbd7 not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.623074 4802 scope.go:117] "RemoveContainer" containerID="c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.623256 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a"} err="failed to get container status \"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\": rpc error: code = NotFound desc = could not find container \"c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a\": container with ID starting with c98845f2793b7d311b0d4f395ad9bb6bd27097ee17384020a399788e93d5025a not found: ID does not exist" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.623279 4802 scope.go:117] "RemoveContainer" containerID="f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb" Dec 01 20:09:55 crc kubenswrapper[4802]: I1201 20:09:55.625336 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb"} err="failed to get container status \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": rpc error: code = NotFound desc = could not find container \"f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb\": container with ID starting with f6cb7b4f6e1872bdbedd61f2feb951c9314406dc54fc0a3fa060cd157eea78eb not found: ID does not exist" Dec 01 20:09:56 crc kubenswrapper[4802]: I1201 20:09:56.297886 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" 
event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerStarted","Data":"e3569d967207a36887782af434da841b19f7ec7e86121a5222fd4968f67b9ab6"} Dec 01 20:09:56 crc kubenswrapper[4802]: I1201 20:09:56.298336 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerStarted","Data":"7a554982c4f7e92314b2531c6eed6d344ad558e519845131dd8b6a0d54942957"} Dec 01 20:09:56 crc kubenswrapper[4802]: I1201 20:09:56.298349 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerStarted","Data":"7e15f69d2eba48218090e2af747b5e3501896bc99163bee70bb2dae82fe9813f"} Dec 01 20:09:56 crc kubenswrapper[4802]: I1201 20:09:56.298359 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerStarted","Data":"d0dcb2e25541dbd0a1345e73c53c926e72a0610cb03a3c36efe65d9de9a3a13e"} Dec 01 20:09:56 crc kubenswrapper[4802]: I1201 20:09:56.298368 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerStarted","Data":"d6149f74feb5e85ec4bf8537762e9d22c1ebbfcd69732856b652082c82adc8ec"} Dec 01 20:09:56 crc kubenswrapper[4802]: I1201 20:09:56.298378 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerStarted","Data":"1a3329f9f1cc1cb27c6168d92daa2ff0c26bfd13f2180257549e1c6ce75fea6b"} Dec 01 20:09:56 crc kubenswrapper[4802]: I1201 20:09:56.301413 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/2.log" Dec 01 20:09:56 crc 
kubenswrapper[4802]: I1201 20:09:56.301860 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/1.log" Dec 01 20:09:56 crc kubenswrapper[4802]: I1201 20:09:56.301893 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8zl28" event={"ID":"bd82ca15-4489-4c15-aaf0-afb6b6787dc6","Type":"ContainerStarted","Data":"94d98c9166ff53a04ffc7861987ebe4f186d2ea4c62770fb485ea54282950c8e"} Dec 01 20:09:56 crc kubenswrapper[4802]: I1201 20:09:56.731785 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="933fb25a-a01a-464e-838a-df1d07bca99e" path="/var/lib/kubelet/pods/933fb25a-a01a-464e-838a-df1d07bca99e/volumes" Dec 01 20:09:59 crc kubenswrapper[4802]: I1201 20:09:59.330164 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerStarted","Data":"590495f2a9bc2fe225d9033ce2736b0951430b6941501481733736d22b94a9dd"} Dec 01 20:10:01 crc kubenswrapper[4802]: I1201 20:10:01.351028 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" event={"ID":"f9085207-962c-4399-a179-d5cbe58bb2a7","Type":"ContainerStarted","Data":"4a8a197ee05439deb0f56679f8707a2b100efce3d3b9f4266c5974760280b6be"} Dec 01 20:10:01 crc kubenswrapper[4802]: I1201 20:10:01.351722 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:10:01 crc kubenswrapper[4802]: I1201 20:10:01.351742 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:10:01 crc kubenswrapper[4802]: I1201 20:10:01.351755 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:10:01 crc kubenswrapper[4802]: I1201 20:10:01.386815 
4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:10:01 crc kubenswrapper[4802]: I1201 20:10:01.389111 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:10:01 crc kubenswrapper[4802]: I1201 20:10:01.393418 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" podStartSLOduration=7.393405332 podStartE2EDuration="7.393405332s" podCreationTimestamp="2025-12-01 20:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:10:01.390311185 +0000 UTC m=+822.952870866" watchObservedRunningTime="2025-12-01 20:10:01.393405332 +0000 UTC m=+822.955964963" Dec 01 20:10:18 crc kubenswrapper[4802]: I1201 20:10:18.972804 4802 scope.go:117] "RemoveContainer" containerID="8bca0d5f90fcacce4e41c5c25c3297ad735feaac885581866866378106ae6d64" Dec 01 20:10:20 crc kubenswrapper[4802]: I1201 20:10:20.475978 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8zl28_bd82ca15-4489-4c15-aaf0-afb6b6787dc6/kube-multus/2.log" Dec 01 20:10:24 crc kubenswrapper[4802]: I1201 20:10:24.974456 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mdpvz" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.737218 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7"] Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.738855 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.742181 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.751348 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7"] Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.818724 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvwcb\" (UniqueName: \"kubernetes.io/projected/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-kube-api-access-vvwcb\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.819243 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.819316 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:36 crc kubenswrapper[4802]: 
I1201 20:10:36.920686 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.921303 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.921464 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvwcb\" (UniqueName: \"kubernetes.io/projected/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-kube-api-access-vvwcb\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.921361 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.921783 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:36 crc kubenswrapper[4802]: I1201 20:10:36.945396 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvwcb\" (UniqueName: \"kubernetes.io/projected/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-kube-api-access-vvwcb\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:37 crc kubenswrapper[4802]: I1201 20:10:37.071796 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:37 crc kubenswrapper[4802]: I1201 20:10:37.541297 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7"] Dec 01 20:10:37 crc kubenswrapper[4802]: W1201 20:10:37.551795 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe6c903_91e9_4c8d_b378_4be5220c8ab0.slice/crio-34f4c219cb6478eb021a075e08cb4f1ca633f97a7ff481bfdb430f00f3458e46 WatchSource:0}: Error finding container 34f4c219cb6478eb021a075e08cb4f1ca633f97a7ff481bfdb430f00f3458e46: Status 404 returned error can't find the container with id 34f4c219cb6478eb021a075e08cb4f1ca633f97a7ff481bfdb430f00f3458e46 Dec 01 20:10:37 crc kubenswrapper[4802]: I1201 20:10:37.590516 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" 
event={"ID":"cbe6c903-91e9-4c8d-b378-4be5220c8ab0","Type":"ContainerStarted","Data":"34f4c219cb6478eb021a075e08cb4f1ca633f97a7ff481bfdb430f00f3458e46"} Dec 01 20:10:38 crc kubenswrapper[4802]: I1201 20:10:38.596622 4802 generic.go:334] "Generic (PLEG): container finished" podID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerID="8d22913c3b6e025ae17035bc5e7dab406b495a6ac7e7ad8fc8103b5b694c837b" exitCode=0 Dec 01 20:10:38 crc kubenswrapper[4802]: I1201 20:10:38.596658 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" event={"ID":"cbe6c903-91e9-4c8d-b378-4be5220c8ab0","Type":"ContainerDied","Data":"8d22913c3b6e025ae17035bc5e7dab406b495a6ac7e7ad8fc8103b5b694c837b"} Dec 01 20:10:38 crc kubenswrapper[4802]: I1201 20:10:38.907356 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tfjnl"] Dec 01 20:10:38 crc kubenswrapper[4802]: I1201 20:10:38.908354 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:38 crc kubenswrapper[4802]: I1201 20:10:38.920325 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tfjnl"] Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.048947 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfpfm\" (UniqueName: \"kubernetes.io/projected/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-kube-api-access-kfpfm\") pod \"redhat-operators-tfjnl\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.049302 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-utilities\") pod \"redhat-operators-tfjnl\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.049424 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-catalog-content\") pod \"redhat-operators-tfjnl\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.150097 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfpfm\" (UniqueName: \"kubernetes.io/projected/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-kube-api-access-kfpfm\") pod \"redhat-operators-tfjnl\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.150145 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-utilities\") pod \"redhat-operators-tfjnl\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.150235 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-catalog-content\") pod \"redhat-operators-tfjnl\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.150704 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-catalog-content\") pod \"redhat-operators-tfjnl\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.150770 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-utilities\") pod \"redhat-operators-tfjnl\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.176070 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfpfm\" (UniqueName: \"kubernetes.io/projected/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-kube-api-access-kfpfm\") pod \"redhat-operators-tfjnl\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.226260 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.412010 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tfjnl"] Dec 01 20:10:39 crc kubenswrapper[4802]: W1201 20:10:39.418500 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode433e918_5de9_47ea_b7e8_d68ffbb1e92b.slice/crio-e77ed90861e0f1cf95f7027bf2d311855b488e219a1a6520878bac4f3fa9afa4 WatchSource:0}: Error finding container e77ed90861e0f1cf95f7027bf2d311855b488e219a1a6520878bac4f3fa9afa4: Status 404 returned error can't find the container with id e77ed90861e0f1cf95f7027bf2d311855b488e219a1a6520878bac4f3fa9afa4 Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.602581 4802 generic.go:334] "Generic (PLEG): container finished" podID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerID="7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146" exitCode=0 Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.602936 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfjnl" event={"ID":"e433e918-5de9-47ea-b7e8-d68ffbb1e92b","Type":"ContainerDied","Data":"7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146"} Dec 01 20:10:39 crc kubenswrapper[4802]: I1201 20:10:39.603117 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfjnl" event={"ID":"e433e918-5de9-47ea-b7e8-d68ffbb1e92b","Type":"ContainerStarted","Data":"e77ed90861e0f1cf95f7027bf2d311855b488e219a1a6520878bac4f3fa9afa4"} Dec 01 20:10:40 crc kubenswrapper[4802]: I1201 20:10:40.613188 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfjnl" 
event={"ID":"e433e918-5de9-47ea-b7e8-d68ffbb1e92b","Type":"ContainerStarted","Data":"65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9"} Dec 01 20:10:40 crc kubenswrapper[4802]: I1201 20:10:40.617816 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" event={"ID":"cbe6c903-91e9-4c8d-b378-4be5220c8ab0","Type":"ContainerStarted","Data":"b7d171171a5621a6a25a0869ba3690e757441d7c27b13c174eca26bfebacb4ce"} Dec 01 20:10:41 crc kubenswrapper[4802]: I1201 20:10:41.629071 4802 generic.go:334] "Generic (PLEG): container finished" podID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerID="65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9" exitCode=0 Dec 01 20:10:41 crc kubenswrapper[4802]: I1201 20:10:41.629164 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfjnl" event={"ID":"e433e918-5de9-47ea-b7e8-d68ffbb1e92b","Type":"ContainerDied","Data":"65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9"} Dec 01 20:10:41 crc kubenswrapper[4802]: I1201 20:10:41.631875 4802 generic.go:334] "Generic (PLEG): container finished" podID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerID="b7d171171a5621a6a25a0869ba3690e757441d7c27b13c174eca26bfebacb4ce" exitCode=0 Dec 01 20:10:41 crc kubenswrapper[4802]: I1201 20:10:41.631939 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" event={"ID":"cbe6c903-91e9-4c8d-b378-4be5220c8ab0","Type":"ContainerDied","Data":"b7d171171a5621a6a25a0869ba3690e757441d7c27b13c174eca26bfebacb4ce"} Dec 01 20:10:42 crc kubenswrapper[4802]: I1201 20:10:42.641478 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfjnl" 
event={"ID":"e433e918-5de9-47ea-b7e8-d68ffbb1e92b","Type":"ContainerStarted","Data":"70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67"} Dec 01 20:10:42 crc kubenswrapper[4802]: I1201 20:10:42.644931 4802 generic.go:334] "Generic (PLEG): container finished" podID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerID="d3deb2ce54efd0478a30af74a77cb5dd2f646d893973dc0ed5e15ea2a7019208" exitCode=0 Dec 01 20:10:42 crc kubenswrapper[4802]: I1201 20:10:42.644970 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" event={"ID":"cbe6c903-91e9-4c8d-b378-4be5220c8ab0","Type":"ContainerDied","Data":"d3deb2ce54efd0478a30af74a77cb5dd2f646d893973dc0ed5e15ea2a7019208"} Dec 01 20:10:42 crc kubenswrapper[4802]: I1201 20:10:42.661352 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tfjnl" podStartSLOduration=1.992917227 podStartE2EDuration="4.661334926s" podCreationTimestamp="2025-12-01 20:10:38 +0000 UTC" firstStartedPulling="2025-12-01 20:10:39.604283939 +0000 UTC m=+861.166843580" lastFinishedPulling="2025-12-01 20:10:42.272701588 +0000 UTC m=+863.835261279" observedRunningTime="2025-12-01 20:10:42.657640031 +0000 UTC m=+864.220199672" watchObservedRunningTime="2025-12-01 20:10:42.661334926 +0000 UTC m=+864.223894567" Dec 01 20:10:43 crc kubenswrapper[4802]: I1201 20:10:43.951982 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.019074 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-util\") pod \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.019187 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvwcb\" (UniqueName: \"kubernetes.io/projected/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-kube-api-access-vvwcb\") pod \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.019292 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-bundle\") pod \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\" (UID: \"cbe6c903-91e9-4c8d-b378-4be5220c8ab0\") " Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.020122 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-bundle" (OuterVolumeSpecName: "bundle") pod "cbe6c903-91e9-4c8d-b378-4be5220c8ab0" (UID: "cbe6c903-91e9-4c8d-b378-4be5220c8ab0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.026477 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-kube-api-access-vvwcb" (OuterVolumeSpecName: "kube-api-access-vvwcb") pod "cbe6c903-91e9-4c8d-b378-4be5220c8ab0" (UID: "cbe6c903-91e9-4c8d-b378-4be5220c8ab0"). InnerVolumeSpecName "kube-api-access-vvwcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.033740 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-util" (OuterVolumeSpecName: "util") pod "cbe6c903-91e9-4c8d-b378-4be5220c8ab0" (UID: "cbe6c903-91e9-4c8d-b378-4be5220c8ab0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.121177 4802 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.121228 4802 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-util\") on node \"crc\" DevicePath \"\"" Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.121237 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvwcb\" (UniqueName: \"kubernetes.io/projected/cbe6c903-91e9-4c8d-b378-4be5220c8ab0-kube-api-access-vvwcb\") on node \"crc\" DevicePath \"\"" Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.662821 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" event={"ID":"cbe6c903-91e9-4c8d-b378-4be5220c8ab0","Type":"ContainerDied","Data":"34f4c219cb6478eb021a075e08cb4f1ca633f97a7ff481bfdb430f00f3458e46"} Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.662879 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34f4c219cb6478eb021a075e08cb4f1ca633f97a7ff481bfdb430f00f3458e46" Dec 01 20:10:44 crc kubenswrapper[4802]: I1201 20:10:44.662920 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.634580 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm"] Dec 01 20:10:46 crc kubenswrapper[4802]: E1201 20:10:46.634991 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerName="extract" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.635003 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerName="extract" Dec 01 20:10:46 crc kubenswrapper[4802]: E1201 20:10:46.635014 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerName="util" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.635021 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerName="util" Dec 01 20:10:46 crc kubenswrapper[4802]: E1201 20:10:46.635031 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerName="pull" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.635037 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerName="pull" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.635131 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe6c903-91e9-4c8d-b378-4be5220c8ab0" containerName="extract" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.635501 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.638399 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-pznm8" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.638708 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.638887 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.653102 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm"] Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.754867 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4x8x\" (UniqueName: \"kubernetes.io/projected/75a57a90-06f5-444e-897d-1191d7838e8b-kube-api-access-r4x8x\") pod \"nmstate-operator-5b5b58f5c8-n9pjm\" (UID: \"75a57a90-06f5-444e-897d-1191d7838e8b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.855915 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4x8x\" (UniqueName: \"kubernetes.io/projected/75a57a90-06f5-444e-897d-1191d7838e8b-kube-api-access-r4x8x\") pod \"nmstate-operator-5b5b58f5c8-n9pjm\" (UID: \"75a57a90-06f5-444e-897d-1191d7838e8b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.873432 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4x8x\" (UniqueName: \"kubernetes.io/projected/75a57a90-06f5-444e-897d-1191d7838e8b-kube-api-access-r4x8x\") pod \"nmstate-operator-5b5b58f5c8-n9pjm\" (UID: 
\"75a57a90-06f5-444e-897d-1191d7838e8b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm" Dec 01 20:10:46 crc kubenswrapper[4802]: I1201 20:10:46.950843 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm" Dec 01 20:10:47 crc kubenswrapper[4802]: I1201 20:10:47.151811 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm"] Dec 01 20:10:47 crc kubenswrapper[4802]: W1201 20:10:47.156310 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75a57a90_06f5_444e_897d_1191d7838e8b.slice/crio-a3959a290aba70d3e717bc6b1915e826b08abbf27f932ad3cb83066d49278219 WatchSource:0}: Error finding container a3959a290aba70d3e717bc6b1915e826b08abbf27f932ad3cb83066d49278219: Status 404 returned error can't find the container with id a3959a290aba70d3e717bc6b1915e826b08abbf27f932ad3cb83066d49278219 Dec 01 20:10:47 crc kubenswrapper[4802]: I1201 20:10:47.678867 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm" event={"ID":"75a57a90-06f5-444e-897d-1191d7838e8b","Type":"ContainerStarted","Data":"a3959a290aba70d3e717bc6b1915e826b08abbf27f932ad3cb83066d49278219"} Dec 01 20:10:49 crc kubenswrapper[4802]: I1201 20:10:49.226420 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:49 crc kubenswrapper[4802]: I1201 20:10:49.226775 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:49 crc kubenswrapper[4802]: I1201 20:10:49.270894 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:49 crc kubenswrapper[4802]: I1201 20:10:49.734253 4802 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:51 crc kubenswrapper[4802]: I1201 20:10:51.497850 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tfjnl"] Dec 01 20:10:51 crc kubenswrapper[4802]: I1201 20:10:51.707404 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm" event={"ID":"75a57a90-06f5-444e-897d-1191d7838e8b","Type":"ContainerStarted","Data":"193df7b5755f6e5e55c54cfa230f34d60903c04468f3bfc3011f5d58de18d459"} Dec 01 20:10:51 crc kubenswrapper[4802]: I1201 20:10:51.707731 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tfjnl" podUID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerName="registry-server" containerID="cri-o://70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67" gracePeriod=2 Dec 01 20:10:51 crc kubenswrapper[4802]: I1201 20:10:51.729606 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n9pjm" podStartSLOduration=2.399125237 podStartE2EDuration="5.72957832s" podCreationTimestamp="2025-12-01 20:10:46 +0000 UTC" firstStartedPulling="2025-12-01 20:10:47.158629045 +0000 UTC m=+868.721188686" lastFinishedPulling="2025-12-01 20:10:50.489082128 +0000 UTC m=+872.051641769" observedRunningTime="2025-12-01 20:10:51.726640848 +0000 UTC m=+873.289200489" watchObservedRunningTime="2025-12-01 20:10:51.72957832 +0000 UTC m=+873.292137971" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.393606 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.526732 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-catalog-content\") pod \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.526768 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-utilities\") pod \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.526801 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfpfm\" (UniqueName: \"kubernetes.io/projected/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-kube-api-access-kfpfm\") pod \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\" (UID: \"e433e918-5de9-47ea-b7e8-d68ffbb1e92b\") " Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.528542 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-utilities" (OuterVolumeSpecName: "utilities") pod "e433e918-5de9-47ea-b7e8-d68ffbb1e92b" (UID: "e433e918-5de9-47ea-b7e8-d68ffbb1e92b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.531959 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-kube-api-access-kfpfm" (OuterVolumeSpecName: "kube-api-access-kfpfm") pod "e433e918-5de9-47ea-b7e8-d68ffbb1e92b" (UID: "e433e918-5de9-47ea-b7e8-d68ffbb1e92b"). InnerVolumeSpecName "kube-api-access-kfpfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.628407 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfpfm\" (UniqueName: \"kubernetes.io/projected/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-kube-api-access-kfpfm\") on node \"crc\" DevicePath \"\"" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.628448 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.655978 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e433e918-5de9-47ea-b7e8-d68ffbb1e92b" (UID: "e433e918-5de9-47ea-b7e8-d68ffbb1e92b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.714343 4802 generic.go:334] "Generic (PLEG): container finished" podID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerID="70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67" exitCode=0 Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.714403 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfjnl" event={"ID":"e433e918-5de9-47ea-b7e8-d68ffbb1e92b","Type":"ContainerDied","Data":"70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67"} Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.714430 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tfjnl" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.714451 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfjnl" event={"ID":"e433e918-5de9-47ea-b7e8-d68ffbb1e92b","Type":"ContainerDied","Data":"e77ed90861e0f1cf95f7027bf2d311855b488e219a1a6520878bac4f3fa9afa4"} Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.714472 4802 scope.go:117] "RemoveContainer" containerID="70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.729223 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e433e918-5de9-47ea-b7e8-d68ffbb1e92b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.729764 4802 scope.go:117] "RemoveContainer" containerID="65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.739326 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tfjnl"] Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.742050 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tfjnl"] Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.758626 4802 scope.go:117] "RemoveContainer" containerID="7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.772428 4802 scope.go:117] "RemoveContainer" containerID="70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67" Dec 01 20:10:52 crc kubenswrapper[4802]: E1201 20:10:52.772826 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67\": container with ID 
starting with 70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67 not found: ID does not exist" containerID="70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.772857 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67"} err="failed to get container status \"70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67\": rpc error: code = NotFound desc = could not find container \"70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67\": container with ID starting with 70d9f408d4ba644c5dc8269e55a1d3be4ed24512a68520feebdb98ffc865bd67 not found: ID does not exist" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.772878 4802 scope.go:117] "RemoveContainer" containerID="65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9" Dec 01 20:10:52 crc kubenswrapper[4802]: E1201 20:10:52.773183 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9\": container with ID starting with 65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9 not found: ID does not exist" containerID="65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.773338 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9"} err="failed to get container status \"65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9\": rpc error: code = NotFound desc = could not find container \"65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9\": container with ID starting with 65e264290b2f063f6d509080b6b7805f3912f1ea3dcbae7c0c8023adc2d39ce9 not found: 
ID does not exist" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.773432 4802 scope.go:117] "RemoveContainer" containerID="7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146" Dec 01 20:10:52 crc kubenswrapper[4802]: E1201 20:10:52.773849 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146\": container with ID starting with 7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146 not found: ID does not exist" containerID="7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146" Dec 01 20:10:52 crc kubenswrapper[4802]: I1201 20:10:52.773879 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146"} err="failed to get container status \"7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146\": rpc error: code = NotFound desc = could not find container \"7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146\": container with ID starting with 7c1f2f881a34887e5116291686eff890182e65925f7145742ceda45cfef1e146 not found: ID does not exist" Dec 01 20:10:54 crc kubenswrapper[4802]: I1201 20:10:54.731823 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" path="/var/lib/kubelet/pods/e433e918-5de9-47ea-b7e8-d68ffbb1e92b/volumes" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.301767 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d"] Dec 01 20:10:57 crc kubenswrapper[4802]: E1201 20:10:57.302449 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerName="registry-server" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.302470 4802 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerName="registry-server" Dec 01 20:10:57 crc kubenswrapper[4802]: E1201 20:10:57.302486 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerName="extract-content" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.302497 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerName="extract-content" Dec 01 20:10:57 crc kubenswrapper[4802]: E1201 20:10:57.302528 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerName="extract-utilities" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.302542 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerName="extract-utilities" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.302723 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="e433e918-5de9-47ea-b7e8-d68ffbb1e92b" containerName="registry-server" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.303610 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.305837 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk"] Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.306100 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-24skv" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.306509 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.313366 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.328259 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d"] Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.333079 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-pkwpd"] Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.333760 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.349779 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk"] Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.386841 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqhjp\" (UniqueName: \"kubernetes.io/projected/df60afec-8603-441f-88bb-31d054b7fea5-kube-api-access-tqhjp\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.386896 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b504895d-c4d4-4261-ab7d-24532e127650-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9kcsk\" (UID: \"b504895d-c4d4-4261-ab7d-24532e127650\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.386925 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zczz\" (UniqueName: 
\"kubernetes.io/projected/b504895d-c4d4-4261-ab7d-24532e127650-kube-api-access-9zczz\") pod \"nmstate-webhook-5f6d4c5ccb-9kcsk\" (UID: \"b504895d-c4d4-4261-ab7d-24532e127650\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.386949 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5qmq\" (UniqueName: \"kubernetes.io/projected/9923b241-3a3d-4051-b5c4-6677dff519ed-kube-api-access-q5qmq\") pod \"nmstate-metrics-7f946cbc9-8g77d\" (UID: \"9923b241-3a3d-4051-b5c4-6677dff519ed\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.386976 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/df60afec-8603-441f-88bb-31d054b7fea5-dbus-socket\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.387001 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/df60afec-8603-441f-88bb-31d054b7fea5-ovs-socket\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.387025 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/df60afec-8603-441f-88bb-31d054b7fea5-nmstate-lock\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.435412 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97"] Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.436281 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.439512 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.439603 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-bjtq5" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.440544 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.454835 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97"] Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488278 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8a21a68c-7399-4632-9564-1c0650125ea5-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488459 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqhjp\" (UniqueName: \"kubernetes.io/projected/df60afec-8603-441f-88bb-31d054b7fea5-kube-api-access-tqhjp\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488570 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8a21a68c-7399-4632-9564-1c0650125ea5-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488609 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b504895d-c4d4-4261-ab7d-24532e127650-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9kcsk\" (UID: \"b504895d-c4d4-4261-ab7d-24532e127650\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488652 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zczz\" (UniqueName: \"kubernetes.io/projected/b504895d-c4d4-4261-ab7d-24532e127650-kube-api-access-9zczz\") pod \"nmstate-webhook-5f6d4c5ccb-9kcsk\" (UID: \"b504895d-c4d4-4261-ab7d-24532e127650\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488680 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5qmq\" (UniqueName: \"kubernetes.io/projected/9923b241-3a3d-4051-b5c4-6677dff519ed-kube-api-access-q5qmq\") pod \"nmstate-metrics-7f946cbc9-8g77d\" (UID: \"9923b241-3a3d-4051-b5c4-6677dff519ed\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488708 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtpp\" (UniqueName: \"kubernetes.io/projected/8a21a68c-7399-4632-9564-1c0650125ea5-kube-api-access-sdtpp\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488737 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/df60afec-8603-441f-88bb-31d054b7fea5-dbus-socket\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488788 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/df60afec-8603-441f-88bb-31d054b7fea5-ovs-socket\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488846 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/df60afec-8603-441f-88bb-31d054b7fea5-nmstate-lock\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.488943 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/df60afec-8603-441f-88bb-31d054b7fea5-nmstate-lock\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: E1201 20:10:57.489411 4802 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 01 20:10:57 crc kubenswrapper[4802]: E1201 20:10:57.489458 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b504895d-c4d4-4261-ab7d-24532e127650-tls-key-pair podName:b504895d-c4d4-4261-ab7d-24532e127650 nodeName:}" failed. 
No retries permitted until 2025-12-01 20:10:57.989441078 +0000 UTC m=+879.552000719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/b504895d-c4d4-4261-ab7d-24532e127650-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-9kcsk" (UID: "b504895d-c4d4-4261-ab7d-24532e127650") : secret "openshift-nmstate-webhook" not found Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.490099 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/df60afec-8603-441f-88bb-31d054b7fea5-dbus-socket\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.490140 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/df60afec-8603-441f-88bb-31d054b7fea5-ovs-socket\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.509937 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5qmq\" (UniqueName: \"kubernetes.io/projected/9923b241-3a3d-4051-b5c4-6677dff519ed-kube-api-access-q5qmq\") pod \"nmstate-metrics-7f946cbc9-8g77d\" (UID: \"9923b241-3a3d-4051-b5c4-6677dff519ed\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.512919 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zczz\" (UniqueName: \"kubernetes.io/projected/b504895d-c4d4-4261-ab7d-24532e127650-kube-api-access-9zczz\") pod \"nmstate-webhook-5f6d4c5ccb-9kcsk\" (UID: \"b504895d-c4d4-4261-ab7d-24532e127650\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 
20:10:57.514023 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqhjp\" (UniqueName: \"kubernetes.io/projected/df60afec-8603-441f-88bb-31d054b7fea5-kube-api-access-tqhjp\") pod \"nmstate-handler-pkwpd\" (UID: \"df60afec-8603-441f-88bb-31d054b7fea5\") " pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.590162 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a21a68c-7399-4632-9564-1c0650125ea5-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.590248 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtpp\" (UniqueName: \"kubernetes.io/projected/8a21a68c-7399-4632-9564-1c0650125ea5-kube-api-access-sdtpp\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.590292 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8a21a68c-7399-4632-9564-1c0650125ea5-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:57 crc kubenswrapper[4802]: E1201 20:10:57.590335 4802 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 01 20:10:57 crc kubenswrapper[4802]: E1201 20:10:57.590402 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a21a68c-7399-4632-9564-1c0650125ea5-plugin-serving-cert 
podName:8a21a68c-7399-4632-9564-1c0650125ea5 nodeName:}" failed. No retries permitted until 2025-12-01 20:10:58.090383736 +0000 UTC m=+879.652943377 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/8a21a68c-7399-4632-9564-1c0650125ea5-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-tlx97" (UID: "8a21a68c-7399-4632-9564-1c0650125ea5") : secret "plugin-serving-cert" not found Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.591163 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8a21a68c-7399-4632-9564-1c0650125ea5-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.619610 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdtpp\" (UniqueName: \"kubernetes.io/projected/8a21a68c-7399-4632-9564-1c0650125ea5-kube-api-access-sdtpp\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.627913 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.628077 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7955fc94f9-7vczf"] Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.628755 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.642664 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7955fc94f9-7vczf"] Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.658158 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.691308 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-service-ca\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.691359 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-oauth-serving-cert\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.691388 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/562ca388-2e12-4577-95c5-d5307d12de5b-console-serving-cert\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.691418 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-trusted-ca-bundle\") pod \"console-7955fc94f9-7vczf\" (UID: 
\"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.691516 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-console-config\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.691688 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/562ca388-2e12-4577-95c5-d5307d12de5b-console-oauth-config\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.691810 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7th6g\" (UniqueName: \"kubernetes.io/projected/562ca388-2e12-4577-95c5-d5307d12de5b-kube-api-access-7th6g\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.753480 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pkwpd" event={"ID":"df60afec-8603-441f-88bb-31d054b7fea5","Type":"ContainerStarted","Data":"8f6e442fa622d7b7a287db5f1aa888567e36cc2d64843026d343d200d323f945"} Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.792967 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-service-ca\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " 
pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.793041 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-oauth-serving-cert\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.793076 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/562ca388-2e12-4577-95c5-d5307d12de5b-console-serving-cert\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.793128 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-trusted-ca-bundle\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.793155 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-console-config\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.793285 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/562ca388-2e12-4577-95c5-d5307d12de5b-console-oauth-config\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " 
pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.793459 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7th6g\" (UniqueName: \"kubernetes.io/projected/562ca388-2e12-4577-95c5-d5307d12de5b-kube-api-access-7th6g\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.794839 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-service-ca\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.795890 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-oauth-serving-cert\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.798638 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-console-config\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.799011 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/562ca388-2e12-4577-95c5-d5307d12de5b-trusted-ca-bundle\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 
crc kubenswrapper[4802]: I1201 20:10:57.802600 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/562ca388-2e12-4577-95c5-d5307d12de5b-console-oauth-config\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.803720 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/562ca388-2e12-4577-95c5-d5307d12de5b-console-serving-cert\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.820046 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7th6g\" (UniqueName: \"kubernetes.io/projected/562ca388-2e12-4577-95c5-d5307d12de5b-kube-api-access-7th6g\") pod \"console-7955fc94f9-7vczf\" (UID: \"562ca388-2e12-4577-95c5-d5307d12de5b\") " pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.988551 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:10:57 crc kubenswrapper[4802]: I1201 20:10:57.995802 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b504895d-c4d4-4261-ab7d-24532e127650-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9kcsk\" (UID: \"b504895d-c4d4-4261-ab7d-24532e127650\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.001055 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b504895d-c4d4-4261-ab7d-24532e127650-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-9kcsk\" (UID: \"b504895d-c4d4-4261-ab7d-24532e127650\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.033268 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d"] Dec 01 20:10:58 crc kubenswrapper[4802]: W1201 20:10:58.038288 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9923b241_3a3d_4051_b5c4_6677dff519ed.slice/crio-0356df63bd7cd91ebc9f9c6c892533a43465dca8f007cce4f56ab3ef04ab4a3c WatchSource:0}: Error finding container 0356df63bd7cd91ebc9f9c6c892533a43465dca8f007cce4f56ab3ef04ab4a3c: Status 404 returned error can't find the container with id 0356df63bd7cd91ebc9f9c6c892533a43465dca8f007cce4f56ab3ef04ab4a3c Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.097606 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a21a68c-7399-4632-9564-1c0650125ea5-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.101720 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a21a68c-7399-4632-9564-1c0650125ea5-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-tlx97\" (UID: \"8a21a68c-7399-4632-9564-1c0650125ea5\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.197451 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7955fc94f9-7vczf"] Dec 01 20:10:58 crc kubenswrapper[4802]: W1201 20:10:58.202908 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod562ca388_2e12_4577_95c5_d5307d12de5b.slice/crio-928aafdc03df4697bddf8c3a409bd11d0dce1b868d9a9e967fbf4ab27f8a3b65 WatchSource:0}: Error finding container 928aafdc03df4697bddf8c3a409bd11d0dce1b868d9a9e967fbf4ab27f8a3b65: Status 404 returned error can't find the container with id 928aafdc03df4697bddf8c3a409bd11d0dce1b868d9a9e967fbf4ab27f8a3b65 Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.247501 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.351099 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.460748 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk"] Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.541497 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97"] Dec 01 20:10:58 crc kubenswrapper[4802]: W1201 20:10:58.544959 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a21a68c_7399_4632_9564_1c0650125ea5.slice/crio-33e0ccdc4dc386465f69620fa6c8439912db5876271356c53fd839d3098509b4 WatchSource:0}: Error finding container 33e0ccdc4dc386465f69620fa6c8439912db5876271356c53fd839d3098509b4: Status 404 returned error can't find the container with id 33e0ccdc4dc386465f69620fa6c8439912db5876271356c53fd839d3098509b4 Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.760538 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" event={"ID":"8a21a68c-7399-4632-9564-1c0650125ea5","Type":"ContainerStarted","Data":"33e0ccdc4dc386465f69620fa6c8439912db5876271356c53fd839d3098509b4"} Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.761711 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" event={"ID":"b504895d-c4d4-4261-ab7d-24532e127650","Type":"ContainerStarted","Data":"8cdb8cd2165ded6aead2c59854b9d6311e3903fa10977f93ab499fbde7ba96dc"} Dec 01 20:10:58 crc kubenswrapper[4802]: I1201 20:10:58.762624 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7955fc94f9-7vczf" event={"ID":"562ca388-2e12-4577-95c5-d5307d12de5b","Type":"ContainerStarted","Data":"928aafdc03df4697bddf8c3a409bd11d0dce1b868d9a9e967fbf4ab27f8a3b65"} Dec 01 20:10:58 crc 
kubenswrapper[4802]: I1201 20:10:58.763375 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d" event={"ID":"9923b241-3a3d-4051-b5c4-6677dff519ed","Type":"ContainerStarted","Data":"0356df63bd7cd91ebc9f9c6c892533a43465dca8f007cce4f56ab3ef04ab4a3c"} Dec 01 20:10:59 crc kubenswrapper[4802]: I1201 20:10:59.770154 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7955fc94f9-7vczf" event={"ID":"562ca388-2e12-4577-95c5-d5307d12de5b","Type":"ContainerStarted","Data":"b1834e9bffb6e0439ab848a9682ba210b14edbe964fb7c8adccdb3f79732df47"} Dec 01 20:10:59 crc kubenswrapper[4802]: I1201 20:10:59.789897 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7955fc94f9-7vczf" podStartSLOduration=2.789876143 podStartE2EDuration="2.789876143s" podCreationTimestamp="2025-12-01 20:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:10:59.787161908 +0000 UTC m=+881.349721569" watchObservedRunningTime="2025-12-01 20:10:59.789876143 +0000 UTC m=+881.352435794" Dec 01 20:11:00 crc kubenswrapper[4802]: I1201 20:11:00.778324 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" event={"ID":"b504895d-c4d4-4261-ab7d-24532e127650","Type":"ContainerStarted","Data":"b15fa28497795c8aae586b750652cc4d88d0466e558b5cf97bb6e14d892b7e6e"} Dec 01 20:11:00 crc kubenswrapper[4802]: I1201 20:11:00.780116 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:11:00 crc kubenswrapper[4802]: I1201 20:11:00.782409 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-pkwpd" 
event={"ID":"df60afec-8603-441f-88bb-31d054b7fea5","Type":"ContainerStarted","Data":"d138757fbae4b5f8d435fcf6ccf661e96175a38070a2be86a67241342b30a689"} Dec 01 20:11:00 crc kubenswrapper[4802]: I1201 20:11:00.783097 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:11:00 crc kubenswrapper[4802]: I1201 20:11:00.784741 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d" event={"ID":"9923b241-3a3d-4051-b5c4-6677dff519ed","Type":"ContainerStarted","Data":"cc9525501e082946021ebcd5397db6959124206ed6134f96b32b4d3c9413cb60"} Dec 01 20:11:00 crc kubenswrapper[4802]: I1201 20:11:00.800452 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" podStartSLOduration=2.005946798 podStartE2EDuration="3.800434035s" podCreationTimestamp="2025-12-01 20:10:57 +0000 UTC" firstStartedPulling="2025-12-01 20:10:58.471001446 +0000 UTC m=+880.033561087" lastFinishedPulling="2025-12-01 20:11:00.265488683 +0000 UTC m=+881.828048324" observedRunningTime="2025-12-01 20:11:00.797448271 +0000 UTC m=+882.360007922" watchObservedRunningTime="2025-12-01 20:11:00.800434035 +0000 UTC m=+882.362993676" Dec 01 20:11:00 crc kubenswrapper[4802]: I1201 20:11:00.814467 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-pkwpd" podStartSLOduration=1.242160601 podStartE2EDuration="3.814449722s" podCreationTimestamp="2025-12-01 20:10:57 +0000 UTC" firstStartedPulling="2025-12-01 20:10:57.692087187 +0000 UTC m=+879.254646828" lastFinishedPulling="2025-12-01 20:11:00.264376308 +0000 UTC m=+881.826935949" observedRunningTime="2025-12-01 20:11:00.81118156 +0000 UTC m=+882.373741201" watchObservedRunningTime="2025-12-01 20:11:00.814449722 +0000 UTC m=+882.377009363" Dec 01 20:11:01 crc kubenswrapper[4802]: I1201 20:11:01.793870 4802 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" event={"ID":"8a21a68c-7399-4632-9564-1c0650125ea5","Type":"ContainerStarted","Data":"f74f63f7fb6569e5847a48a7674c8cdc1a3ce48272c474a1f45938dcf4a7b0ac"} Dec 01 20:11:04 crc kubenswrapper[4802]: I1201 20:11:04.817056 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d" event={"ID":"9923b241-3a3d-4051-b5c4-6677dff519ed","Type":"ContainerStarted","Data":"aa8b5bbbc2ec0f8654dd5e8f2dece058fe25fb3be10c1835c3060c16e5dfbc9c"} Dec 01 20:11:04 crc kubenswrapper[4802]: I1201 20:11:04.835736 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-tlx97" podStartSLOduration=5.029520701 podStartE2EDuration="7.835718555s" podCreationTimestamp="2025-12-01 20:10:57 +0000 UTC" firstStartedPulling="2025-12-01 20:10:58.546919224 +0000 UTC m=+880.109478865" lastFinishedPulling="2025-12-01 20:11:01.353117078 +0000 UTC m=+882.915676719" observedRunningTime="2025-12-01 20:11:01.80885354 +0000 UTC m=+883.371413191" watchObservedRunningTime="2025-12-01 20:11:04.835718555 +0000 UTC m=+886.398278206" Dec 01 20:11:07 crc kubenswrapper[4802]: I1201 20:11:07.687830 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-pkwpd" Dec 01 20:11:07 crc kubenswrapper[4802]: I1201 20:11:07.733473 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8g77d" podStartSLOduration=4.9773408329999995 podStartE2EDuration="10.733455214s" podCreationTimestamp="2025-12-01 20:10:57 +0000 UTC" firstStartedPulling="2025-12-01 20:10:58.041307277 +0000 UTC m=+879.603866928" lastFinishedPulling="2025-12-01 20:11:03.797421658 +0000 UTC m=+885.359981309" observedRunningTime="2025-12-01 20:11:04.834822868 +0000 UTC m=+886.397382509" watchObservedRunningTime="2025-12-01 
20:11:07.733455214 +0000 UTC m=+889.296014865" Dec 01 20:11:07 crc kubenswrapper[4802]: I1201 20:11:07.988823 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:11:07 crc kubenswrapper[4802]: I1201 20:11:07.988929 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:11:07 crc kubenswrapper[4802]: I1201 20:11:07.996434 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:11:08 crc kubenswrapper[4802]: I1201 20:11:08.855348 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7955fc94f9-7vczf" Dec 01 20:11:08 crc kubenswrapper[4802]: I1201 20:11:08.910407 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-62kxj"] Dec 01 20:11:13 crc kubenswrapper[4802]: I1201 20:11:13.862971 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5zrd9"] Dec 01 20:11:13 crc kubenswrapper[4802]: I1201 20:11:13.865909 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:13 crc kubenswrapper[4802]: I1201 20:11:13.874299 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zrd9"] Dec 01 20:11:13 crc kubenswrapper[4802]: I1201 20:11:13.921684 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-utilities\") pod \"community-operators-5zrd9\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:13 crc kubenswrapper[4802]: I1201 20:11:13.921736 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-catalog-content\") pod \"community-operators-5zrd9\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:13 crc kubenswrapper[4802]: I1201 20:11:13.921776 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx45f\" (UniqueName: \"kubernetes.io/projected/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-kube-api-access-kx45f\") pod \"community-operators-5zrd9\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.023518 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx45f\" (UniqueName: \"kubernetes.io/projected/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-kube-api-access-kx45f\") pod \"community-operators-5zrd9\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.024137 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-utilities\") pod \"community-operators-5zrd9\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.024173 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-catalog-content\") pod \"community-operators-5zrd9\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.024690 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-utilities\") pod \"community-operators-5zrd9\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.024961 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-catalog-content\") pod \"community-operators-5zrd9\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.048579 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx45f\" (UniqueName: \"kubernetes.io/projected/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-kube-api-access-kx45f\") pod \"community-operators-5zrd9\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.198373 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.448773 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zrd9"] Dec 01 20:11:14 crc kubenswrapper[4802]: W1201 20:11:14.454808 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1b7fd5f_9fc6_4ae0_b747_d7ae1ac49059.slice/crio-cad9092848b89e43f6900aa11a7628d1754841200fb96b35326679cb4c9e2a6d WatchSource:0}: Error finding container cad9092848b89e43f6900aa11a7628d1754841200fb96b35326679cb4c9e2a6d: Status 404 returned error can't find the container with id cad9092848b89e43f6900aa11a7628d1754841200fb96b35326679cb4c9e2a6d Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.892548 4802 generic.go:334] "Generic (PLEG): container finished" podID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerID="c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043" exitCode=0 Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.892652 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrd9" event={"ID":"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059","Type":"ContainerDied","Data":"c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043"} Dec 01 20:11:14 crc kubenswrapper[4802]: I1201 20:11:14.892929 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrd9" event={"ID":"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059","Type":"ContainerStarted","Data":"cad9092848b89e43f6900aa11a7628d1754841200fb96b35326679cb4c9e2a6d"} Dec 01 20:11:15 crc kubenswrapper[4802]: I1201 20:11:15.903727 4802 generic.go:334] "Generic (PLEG): container finished" podID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerID="6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b" exitCode=0 Dec 01 20:11:15 crc kubenswrapper[4802]: I1201 
20:11:15.903847 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrd9" event={"ID":"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059","Type":"ContainerDied","Data":"6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b"} Dec 01 20:11:16 crc kubenswrapper[4802]: I1201 20:11:16.913489 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrd9" event={"ID":"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059","Type":"ContainerStarted","Data":"8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151"} Dec 01 20:11:16 crc kubenswrapper[4802]: I1201 20:11:16.938147 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5zrd9" podStartSLOduration=2.420001153 podStartE2EDuration="3.938128682s" podCreationTimestamp="2025-12-01 20:11:13 +0000 UTC" firstStartedPulling="2025-12-01 20:11:14.894236048 +0000 UTC m=+896.456795689" lastFinishedPulling="2025-12-01 20:11:16.412363557 +0000 UTC m=+897.974923218" observedRunningTime="2025-12-01 20:11:16.934913802 +0000 UTC m=+898.497473453" watchObservedRunningTime="2025-12-01 20:11:16.938128682 +0000 UTC m=+898.500688333" Dec 01 20:11:18 crc kubenswrapper[4802]: I1201 20:11:18.256056 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-9kcsk" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.696290 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rttqg"] Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.699836 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.719767 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rttqg"] Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.782996 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmk7d\" (UniqueName: \"kubernetes.io/projected/f94dd430-2531-49f8-8fac-e6a1167b062a-kube-api-access-vmk7d\") pod \"redhat-marketplace-rttqg\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.783108 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-utilities\") pod \"redhat-marketplace-rttqg\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.783144 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-catalog-content\") pod \"redhat-marketplace-rttqg\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.884530 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-utilities\") pod \"redhat-marketplace-rttqg\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.884584 4802 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-catalog-content\") pod \"redhat-marketplace-rttqg\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.884697 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmk7d\" (UniqueName: \"kubernetes.io/projected/f94dd430-2531-49f8-8fac-e6a1167b062a-kube-api-access-vmk7d\") pod \"redhat-marketplace-rttqg\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.885336 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-utilities\") pod \"redhat-marketplace-rttqg\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.885381 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-catalog-content\") pod \"redhat-marketplace-rttqg\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:23 crc kubenswrapper[4802]: I1201 20:11:23.910542 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmk7d\" (UniqueName: \"kubernetes.io/projected/f94dd430-2531-49f8-8fac-e6a1167b062a-kube-api-access-vmk7d\") pod \"redhat-marketplace-rttqg\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:24 crc kubenswrapper[4802]: I1201 20:11:24.035463 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:24 crc kubenswrapper[4802]: I1201 20:11:24.199342 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:24 crc kubenswrapper[4802]: I1201 20:11:24.201468 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:24 crc kubenswrapper[4802]: I1201 20:11:24.244464 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rttqg"] Dec 01 20:11:24 crc kubenswrapper[4802]: I1201 20:11:24.274646 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:24 crc kubenswrapper[4802]: I1201 20:11:24.968920 4802 generic.go:334] "Generic (PLEG): container finished" podID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerID="bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5" exitCode=0 Dec 01 20:11:24 crc kubenswrapper[4802]: I1201 20:11:24.969818 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rttqg" event={"ID":"f94dd430-2531-49f8-8fac-e6a1167b062a","Type":"ContainerDied","Data":"bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5"} Dec 01 20:11:24 crc kubenswrapper[4802]: I1201 20:11:24.969850 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rttqg" event={"ID":"f94dd430-2531-49f8-8fac-e6a1167b062a","Type":"ContainerStarted","Data":"5ed67e3b935910e7e14461dd1234745fb02210703789ec67ede679cd148aafb6"} Dec 01 20:11:25 crc kubenswrapper[4802]: I1201 20:11:25.040825 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:25 crc kubenswrapper[4802]: I1201 20:11:25.977694 4802 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rttqg" event={"ID":"f94dd430-2531-49f8-8fac-e6a1167b062a","Type":"ContainerStarted","Data":"4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827"} Dec 01 20:11:26 crc kubenswrapper[4802]: I1201 20:11:26.652607 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zrd9"] Dec 01 20:11:26 crc kubenswrapper[4802]: I1201 20:11:26.985472 4802 generic.go:334] "Generic (PLEG): container finished" podID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerID="4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827" exitCode=0 Dec 01 20:11:26 crc kubenswrapper[4802]: I1201 20:11:26.985517 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rttqg" event={"ID":"f94dd430-2531-49f8-8fac-e6a1167b062a","Type":"ContainerDied","Data":"4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827"} Dec 01 20:11:27 crc kubenswrapper[4802]: I1201 20:11:27.994124 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rttqg" event={"ID":"f94dd430-2531-49f8-8fac-e6a1167b062a","Type":"ContainerStarted","Data":"d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111"} Dec 01 20:11:27 crc kubenswrapper[4802]: I1201 20:11:27.994339 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5zrd9" podUID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerName="registry-server" containerID="cri-o://8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151" gracePeriod=2 Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.026087 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rttqg" podStartSLOduration=2.300881996 podStartE2EDuration="5.026067315s" podCreationTimestamp="2025-12-01 20:11:23 +0000 UTC" 
firstStartedPulling="2025-12-01 20:11:24.971166475 +0000 UTC m=+906.533726116" lastFinishedPulling="2025-12-01 20:11:27.696351794 +0000 UTC m=+909.258911435" observedRunningTime="2025-12-01 20:11:28.020376618 +0000 UTC m=+909.582936259" watchObservedRunningTime="2025-12-01 20:11:28.026067315 +0000 UTC m=+909.588626956" Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.089100 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.089171 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.379981 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.464520 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx45f\" (UniqueName: \"kubernetes.io/projected/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-kube-api-access-kx45f\") pod \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.464899 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-catalog-content\") pod \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.464991 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-utilities\") pod \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\" (UID: \"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059\") " Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.466117 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-utilities" (OuterVolumeSpecName: "utilities") pod "c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" (UID: "c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.473580 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-kube-api-access-kx45f" (OuterVolumeSpecName: "kube-api-access-kx45f") pod "c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" (UID: "c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059"). InnerVolumeSpecName "kube-api-access-kx45f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.525989 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" (UID: "c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.566709 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.566746 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx45f\" (UniqueName: \"kubernetes.io/projected/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-kube-api-access-kx45f\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:28 crc kubenswrapper[4802]: I1201 20:11:28.566758 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.004524 4802 generic.go:334] "Generic (PLEG): container finished" podID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerID="8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151" exitCode=0 Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.004588 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zrd9" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.004588 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrd9" event={"ID":"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059","Type":"ContainerDied","Data":"8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151"} Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.004670 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zrd9" event={"ID":"c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059","Type":"ContainerDied","Data":"cad9092848b89e43f6900aa11a7628d1754841200fb96b35326679cb4c9e2a6d"} Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.004689 4802 scope.go:117] "RemoveContainer" containerID="8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.029295 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zrd9"] Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.033441 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5zrd9"] Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.035045 4802 scope.go:117] "RemoveContainer" containerID="6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.054415 4802 scope.go:117] "RemoveContainer" containerID="c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.090664 4802 scope.go:117] "RemoveContainer" containerID="8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151" Dec 01 20:11:29 crc kubenswrapper[4802]: E1201 20:11:29.091538 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151\": container with ID starting with 8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151 not found: ID does not exist" containerID="8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.091590 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151"} err="failed to get container status \"8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151\": rpc error: code = NotFound desc = could not find container \"8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151\": container with ID starting with 8cefe17f1e6aa209718259b2a5a9a6233b01ae43ee46956096c898e0e708d151 not found: ID does not exist" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.091617 4802 scope.go:117] "RemoveContainer" containerID="6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b" Dec 01 20:11:29 crc kubenswrapper[4802]: E1201 20:11:29.092092 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b\": container with ID starting with 6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b not found: ID does not exist" containerID="6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.092115 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b"} err="failed to get container status \"6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b\": rpc error: code = NotFound desc = could not find container \"6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b\": container with ID 
starting with 6116f90767b67bacee87a86fb9ed549fca4c621b085e2ccf8006f6fb74dbbd3b not found: ID does not exist" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.092131 4802 scope.go:117] "RemoveContainer" containerID="c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043" Dec 01 20:11:29 crc kubenswrapper[4802]: E1201 20:11:29.092556 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043\": container with ID starting with c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043 not found: ID does not exist" containerID="c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043" Dec 01 20:11:29 crc kubenswrapper[4802]: I1201 20:11:29.092589 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043"} err="failed to get container status \"c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043\": rpc error: code = NotFound desc = could not find container \"c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043\": container with ID starting with c91790686beef9dfaa849f185235bad96e57ca9a626c676286bacb78eadd7043 not found: ID does not exist" Dec 01 20:11:30 crc kubenswrapper[4802]: I1201 20:11:30.730267 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" path="/var/lib/kubelet/pods/c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059/volumes" Dec 01 20:11:33 crc kubenswrapper[4802]: I1201 20:11:33.982100 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-62kxj" podUID="9f3283e5-38b2-4f3e-a0d4-122b734e79d4" containerName="console" containerID="cri-o://b0db5e51eb0fcea0710fd46dd7bfd5394fec1eb11e12a9164997990571929467" gracePeriod=15 Dec 01 20:11:34 crc kubenswrapper[4802]: 
I1201 20:11:34.036123 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.036211 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.082223 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.600728 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s"] Dec 01 20:11:34 crc kubenswrapper[4802]: E1201 20:11:34.601338 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerName="extract-utilities" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.601356 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerName="extract-utilities" Dec 01 20:11:34 crc kubenswrapper[4802]: E1201 20:11:34.601372 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerName="extract-content" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.601380 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerName="extract-content" Dec 01 20:11:34 crc kubenswrapper[4802]: E1201 20:11:34.601410 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerName="registry-server" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.601422 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerName="registry-server" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.601580 4802 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b7fd5f-9fc6-4ae0-b747-d7ae1ac49059" containerName="registry-server" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.602587 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.605749 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.618442 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s"] Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.767217 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjq6t\" (UniqueName: \"kubernetes.io/projected/f0139070-b593-45e2-9098-969b27b2be38-kube-api-access-pjq6t\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.767302 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.767435 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-util\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.869850 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.870124 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjq6t\" (UniqueName: \"kubernetes.io/projected/f0139070-b593-45e2-9098-969b27b2be38-kube-api-access-pjq6t\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.870262 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.871397 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.871791 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.911150 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjq6t\" (UniqueName: \"kubernetes.io/projected/f0139070-b593-45e2-9098-969b27b2be38-kube-api-access-pjq6t\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:34 crc kubenswrapper[4802]: I1201 20:11:34.922291 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.064254 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-62kxj_9f3283e5-38b2-4f3e-a0d4-122b734e79d4/console/0.log" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.064773 4802 generic.go:334] "Generic (PLEG): container finished" podID="9f3283e5-38b2-4f3e-a0d4-122b734e79d4" containerID="b0db5e51eb0fcea0710fd46dd7bfd5394fec1eb11e12a9164997990571929467" exitCode=2 Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.065353 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-62kxj" event={"ID":"9f3283e5-38b2-4f3e-a0d4-122b734e79d4","Type":"ContainerDied","Data":"b0db5e51eb0fcea0710fd46dd7bfd5394fec1eb11e12a9164997990571929467"} Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.123456 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.195250 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s"] Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.457871 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-62kxj_9f3283e5-38b2-4f3e-a0d4-122b734e79d4/console/0.log" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.457979 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-62kxj" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.592958 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-service-ca\") pod \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.593009 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c852k\" (UniqueName: \"kubernetes.io/projected/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-kube-api-access-c852k\") pod \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.593053 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-config\") pod \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.593076 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-serving-cert\") pod \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.593122 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-oauth-config\") pod \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.593141 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-oauth-serving-cert\") pod \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.593162 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-trusted-ca-bundle\") pod \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\" (UID: \"9f3283e5-38b2-4f3e-a0d4-122b734e79d4\") " Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.594482 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-config" (OuterVolumeSpecName: "console-config") pod "9f3283e5-38b2-4f3e-a0d4-122b734e79d4" (UID: "9f3283e5-38b2-4f3e-a0d4-122b734e79d4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.594548 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9f3283e5-38b2-4f3e-a0d4-122b734e79d4" (UID: "9f3283e5-38b2-4f3e-a0d4-122b734e79d4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.594609 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-service-ca" (OuterVolumeSpecName: "service-ca") pod "9f3283e5-38b2-4f3e-a0d4-122b734e79d4" (UID: "9f3283e5-38b2-4f3e-a0d4-122b734e79d4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.594618 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9f3283e5-38b2-4f3e-a0d4-122b734e79d4" (UID: "9f3283e5-38b2-4f3e-a0d4-122b734e79d4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.601814 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9f3283e5-38b2-4f3e-a0d4-122b734e79d4" (UID: "9f3283e5-38b2-4f3e-a0d4-122b734e79d4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.601847 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-kube-api-access-c852k" (OuterVolumeSpecName: "kube-api-access-c852k") pod "9f3283e5-38b2-4f3e-a0d4-122b734e79d4" (UID: "9f3283e5-38b2-4f3e-a0d4-122b734e79d4"). InnerVolumeSpecName "kube-api-access-c852k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.602494 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9f3283e5-38b2-4f3e-a0d4-122b734e79d4" (UID: "9f3283e5-38b2-4f3e-a0d4-122b734e79d4"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.695483 4802 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.695595 4802 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.695616 4802 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.695636 4802 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.695657 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c852k\" (UniqueName: \"kubernetes.io/projected/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-kube-api-access-c852k\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.695679 4802 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:35 crc kubenswrapper[4802]: I1201 20:11:35.695697 4802 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f3283e5-38b2-4f3e-a0d4-122b734e79d4-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:36 crc 
kubenswrapper[4802]: I1201 20:11:36.071990 4802 generic.go:334] "Generic (PLEG): container finished" podID="f0139070-b593-45e2-9098-969b27b2be38" containerID="bd51ed7c86db01402278ad49c131e0645e7dc72b47b2a3925d6769755b87842b" exitCode=0 Dec 01 20:11:36 crc kubenswrapper[4802]: I1201 20:11:36.072077 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" event={"ID":"f0139070-b593-45e2-9098-969b27b2be38","Type":"ContainerDied","Data":"bd51ed7c86db01402278ad49c131e0645e7dc72b47b2a3925d6769755b87842b"} Dec 01 20:11:36 crc kubenswrapper[4802]: I1201 20:11:36.072136 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" event={"ID":"f0139070-b593-45e2-9098-969b27b2be38","Type":"ContainerStarted","Data":"3555c0467ca1d327a586a6961d2e6a02a785bf7e22621be0d06d8dc4fbc49c0e"} Dec 01 20:11:36 crc kubenswrapper[4802]: I1201 20:11:36.073981 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-62kxj_9f3283e5-38b2-4f3e-a0d4-122b734e79d4/console/0.log" Dec 01 20:11:36 crc kubenswrapper[4802]: I1201 20:11:36.074178 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-62kxj" Dec 01 20:11:36 crc kubenswrapper[4802]: I1201 20:11:36.074706 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-62kxj" event={"ID":"9f3283e5-38b2-4f3e-a0d4-122b734e79d4","Type":"ContainerDied","Data":"0fbb1831ea417e8b84a035f0d6c021dc5b05cad0e34350f16864f4d0cf9eb227"} Dec 01 20:11:36 crc kubenswrapper[4802]: I1201 20:11:36.074797 4802 scope.go:117] "RemoveContainer" containerID="b0db5e51eb0fcea0710fd46dd7bfd5394fec1eb11e12a9164997990571929467" Dec 01 20:11:36 crc kubenswrapper[4802]: I1201 20:11:36.125060 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-62kxj"] Dec 01 20:11:36 crc kubenswrapper[4802]: I1201 20:11:36.131025 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-62kxj"] Dec 01 20:11:36 crc kubenswrapper[4802]: I1201 20:11:36.736528 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3283e5-38b2-4f3e-a0d4-122b734e79d4" path="/var/lib/kubelet/pods/9f3283e5-38b2-4f3e-a0d4-122b734e79d4/volumes" Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.094398 4802 generic.go:334] "Generic (PLEG): container finished" podID="f0139070-b593-45e2-9098-969b27b2be38" containerID="70f754e6262ca72212946536c555b1b15fad0f129d8db9f95d5502a171cdf3a0" exitCode=0 Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.094537 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" event={"ID":"f0139070-b593-45e2-9098-969b27b2be38","Type":"ContainerDied","Data":"70f754e6262ca72212946536c555b1b15fad0f129d8db9f95d5502a171cdf3a0"} Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.327505 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rttqg"] Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 
20:11:38.327853 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rttqg" podUID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerName="registry-server" containerID="cri-o://d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111" gracePeriod=2 Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.747422 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.850149 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmk7d\" (UniqueName: \"kubernetes.io/projected/f94dd430-2531-49f8-8fac-e6a1167b062a-kube-api-access-vmk7d\") pod \"f94dd430-2531-49f8-8fac-e6a1167b062a\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.850287 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-utilities\") pod \"f94dd430-2531-49f8-8fac-e6a1167b062a\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.850411 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-catalog-content\") pod \"f94dd430-2531-49f8-8fac-e6a1167b062a\" (UID: \"f94dd430-2531-49f8-8fac-e6a1167b062a\") " Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.851681 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-utilities" (OuterVolumeSpecName: "utilities") pod "f94dd430-2531-49f8-8fac-e6a1167b062a" (UID: "f94dd430-2531-49f8-8fac-e6a1167b062a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.857081 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94dd430-2531-49f8-8fac-e6a1167b062a-kube-api-access-vmk7d" (OuterVolumeSpecName: "kube-api-access-vmk7d") pod "f94dd430-2531-49f8-8fac-e6a1167b062a" (UID: "f94dd430-2531-49f8-8fac-e6a1167b062a"). InnerVolumeSpecName "kube-api-access-vmk7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.868389 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f94dd430-2531-49f8-8fac-e6a1167b062a" (UID: "f94dd430-2531-49f8-8fac-e6a1167b062a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.953239 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.953287 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94dd430-2531-49f8-8fac-e6a1167b062a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:38 crc kubenswrapper[4802]: I1201 20:11:38.953326 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmk7d\" (UniqueName: \"kubernetes.io/projected/f94dd430-2531-49f8-8fac-e6a1167b062a-kube-api-access-vmk7d\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.102731 4802 generic.go:334] "Generic (PLEG): container finished" podID="f0139070-b593-45e2-9098-969b27b2be38" 
containerID="693a0491a7fdc0b9dc01827b21be19b0d850ad3349421420ba4dff4e9f315c40" exitCode=0 Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.102831 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" event={"ID":"f0139070-b593-45e2-9098-969b27b2be38","Type":"ContainerDied","Data":"693a0491a7fdc0b9dc01827b21be19b0d850ad3349421420ba4dff4e9f315c40"} Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.107043 4802 generic.go:334] "Generic (PLEG): container finished" podID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerID="d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111" exitCode=0 Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.107275 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rttqg" event={"ID":"f94dd430-2531-49f8-8fac-e6a1167b062a","Type":"ContainerDied","Data":"d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111"} Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.107444 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rttqg" event={"ID":"f94dd430-2531-49f8-8fac-e6a1167b062a","Type":"ContainerDied","Data":"5ed67e3b935910e7e14461dd1234745fb02210703789ec67ede679cd148aafb6"} Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.107540 4802 scope.go:117] "RemoveContainer" containerID="d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.107478 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rttqg" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.139982 4802 scope.go:117] "RemoveContainer" containerID="4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.153339 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rttqg"] Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.162711 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rttqg"] Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.169018 4802 scope.go:117] "RemoveContainer" containerID="bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.189720 4802 scope.go:117] "RemoveContainer" containerID="d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111" Dec 01 20:11:39 crc kubenswrapper[4802]: E1201 20:11:39.190505 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111\": container with ID starting with d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111 not found: ID does not exist" containerID="d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.190630 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111"} err="failed to get container status \"d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111\": rpc error: code = NotFound desc = could not find container \"d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111\": container with ID starting with d42b8bcbc529916a4a759ea6082f06750ef5ecfc8a0e792889fa0bdf2aa19111 not found: 
ID does not exist" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.190769 4802 scope.go:117] "RemoveContainer" containerID="4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827" Dec 01 20:11:39 crc kubenswrapper[4802]: E1201 20:11:39.191122 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827\": container with ID starting with 4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827 not found: ID does not exist" containerID="4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.191255 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827"} err="failed to get container status \"4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827\": rpc error: code = NotFound desc = could not find container \"4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827\": container with ID starting with 4fd461acfa90e6ebafec7b6fab187aae0e2bf661312683ef0f5012bd1fff7827 not found: ID does not exist" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.191348 4802 scope.go:117] "RemoveContainer" containerID="bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5" Dec 01 20:11:39 crc kubenswrapper[4802]: E1201 20:11:39.192029 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5\": container with ID starting with bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5 not found: ID does not exist" containerID="bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5" Dec 01 20:11:39 crc kubenswrapper[4802]: I1201 20:11:39.192129 4802 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5"} err="failed to get container status \"bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5\": rpc error: code = NotFound desc = could not find container \"bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5\": container with ID starting with bbb2808966a6724a28fa60b37e387dda14b7b2e3ffa0bdd65ed097a85f512ff5 not found: ID does not exist" Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.357635 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.477523 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-bundle\") pod \"f0139070-b593-45e2-9098-969b27b2be38\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.477680 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-util\") pod \"f0139070-b593-45e2-9098-969b27b2be38\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.477846 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjq6t\" (UniqueName: \"kubernetes.io/projected/f0139070-b593-45e2-9098-969b27b2be38-kube-api-access-pjq6t\") pod \"f0139070-b593-45e2-9098-969b27b2be38\" (UID: \"f0139070-b593-45e2-9098-969b27b2be38\") " Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.479500 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-bundle" 
(OuterVolumeSpecName: "bundle") pod "f0139070-b593-45e2-9098-969b27b2be38" (UID: "f0139070-b593-45e2-9098-969b27b2be38"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.487312 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0139070-b593-45e2-9098-969b27b2be38-kube-api-access-pjq6t" (OuterVolumeSpecName: "kube-api-access-pjq6t") pod "f0139070-b593-45e2-9098-969b27b2be38" (UID: "f0139070-b593-45e2-9098-969b27b2be38"). InnerVolumeSpecName "kube-api-access-pjq6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.491228 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-util" (OuterVolumeSpecName: "util") pod "f0139070-b593-45e2-9098-969b27b2be38" (UID: "f0139070-b593-45e2-9098-969b27b2be38"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.579374 4802 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.579823 4802 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0139070-b593-45e2-9098-969b27b2be38-util\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.579897 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjq6t\" (UniqueName: \"kubernetes.io/projected/f0139070-b593-45e2-9098-969b27b2be38-kube-api-access-pjq6t\") on node \"crc\" DevicePath \"\"" Dec 01 20:11:40 crc kubenswrapper[4802]: I1201 20:11:40.727873 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f94dd430-2531-49f8-8fac-e6a1167b062a" path="/var/lib/kubelet/pods/f94dd430-2531-49f8-8fac-e6a1167b062a/volumes" Dec 01 20:11:41 crc kubenswrapper[4802]: I1201 20:11:41.131600 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" event={"ID":"f0139070-b593-45e2-9098-969b27b2be38","Type":"ContainerDied","Data":"3555c0467ca1d327a586a6961d2e6a02a785bf7e22621be0d06d8dc4fbc49c0e"} Dec 01 20:11:41 crc kubenswrapper[4802]: I1201 20:11:41.131688 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3555c0467ca1d327a586a6961d2e6a02a785bf7e22621be0d06d8dc4fbc49c0e" Dec 01 20:11:41 crc kubenswrapper[4802]: I1201 20:11:41.131711 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.458803 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2"] Dec 01 20:11:51 crc kubenswrapper[4802]: E1201 20:11:51.459998 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerName="extract-utilities" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460016 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerName="extract-utilities" Dec 01 20:11:51 crc kubenswrapper[4802]: E1201 20:11:51.460027 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0139070-b593-45e2-9098-969b27b2be38" containerName="extract" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460033 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0139070-b593-45e2-9098-969b27b2be38" containerName="extract" Dec 01 20:11:51 crc kubenswrapper[4802]: E1201 20:11:51.460042 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3283e5-38b2-4f3e-a0d4-122b734e79d4" containerName="console" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460049 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3283e5-38b2-4f3e-a0d4-122b734e79d4" containerName="console" Dec 01 20:11:51 crc kubenswrapper[4802]: E1201 20:11:51.460064 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0139070-b593-45e2-9098-969b27b2be38" containerName="pull" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460073 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0139070-b593-45e2-9098-969b27b2be38" containerName="pull" Dec 01 20:11:51 crc kubenswrapper[4802]: E1201 20:11:51.460084 4802 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerName="registry-server" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460092 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerName="registry-server" Dec 01 20:11:51 crc kubenswrapper[4802]: E1201 20:11:51.460103 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0139070-b593-45e2-9098-969b27b2be38" containerName="util" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460109 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0139070-b593-45e2-9098-969b27b2be38" containerName="util" Dec 01 20:11:51 crc kubenswrapper[4802]: E1201 20:11:51.460138 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerName="extract-content" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460145 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerName="extract-content" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460282 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3283e5-38b2-4f3e-a0d4-122b734e79d4" containerName="console" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460297 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94dd430-2531-49f8-8fac-e6a1167b062a" containerName="registry-server" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460309 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0139070-b593-45e2-9098-969b27b2be38" containerName="extract" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.460802 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.463532 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-6wv4f" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.464138 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.464149 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.464601 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.481079 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.493017 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2"] Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.551672 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27545afc-bda6-468c-b9c2-8ab3182546c8-webhook-cert\") pod \"metallb-operator-controller-manager-79cd97f75d-bjkc2\" (UID: \"27545afc-bda6-468c-b9c2-8ab3182546c8\") " pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.551770 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27545afc-bda6-468c-b9c2-8ab3182546c8-apiservice-cert\") pod \"metallb-operator-controller-manager-79cd97f75d-bjkc2\" (UID: 
\"27545afc-bda6-468c-b9c2-8ab3182546c8\") " pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.551833 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67mtf\" (UniqueName: \"kubernetes.io/projected/27545afc-bda6-468c-b9c2-8ab3182546c8-kube-api-access-67mtf\") pod \"metallb-operator-controller-manager-79cd97f75d-bjkc2\" (UID: \"27545afc-bda6-468c-b9c2-8ab3182546c8\") " pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.653711 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27545afc-bda6-468c-b9c2-8ab3182546c8-webhook-cert\") pod \"metallb-operator-controller-manager-79cd97f75d-bjkc2\" (UID: \"27545afc-bda6-468c-b9c2-8ab3182546c8\") " pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.653813 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27545afc-bda6-468c-b9c2-8ab3182546c8-apiservice-cert\") pod \"metallb-operator-controller-manager-79cd97f75d-bjkc2\" (UID: \"27545afc-bda6-468c-b9c2-8ab3182546c8\") " pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.653876 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67mtf\" (UniqueName: \"kubernetes.io/projected/27545afc-bda6-468c-b9c2-8ab3182546c8-kube-api-access-67mtf\") pod \"metallb-operator-controller-manager-79cd97f75d-bjkc2\" (UID: \"27545afc-bda6-468c-b9c2-8ab3182546c8\") " pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.663221 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27545afc-bda6-468c-b9c2-8ab3182546c8-webhook-cert\") pod \"metallb-operator-controller-manager-79cd97f75d-bjkc2\" (UID: \"27545afc-bda6-468c-b9c2-8ab3182546c8\") " pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.664818 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27545afc-bda6-468c-b9c2-8ab3182546c8-apiservice-cert\") pod \"metallb-operator-controller-manager-79cd97f75d-bjkc2\" (UID: \"27545afc-bda6-468c-b9c2-8ab3182546c8\") " pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.674060 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67mtf\" (UniqueName: \"kubernetes.io/projected/27545afc-bda6-468c-b9c2-8ab3182546c8-kube-api-access-67mtf\") pod \"metallb-operator-controller-manager-79cd97f75d-bjkc2\" (UID: \"27545afc-bda6-468c-b9c2-8ab3182546c8\") " pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.710665 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p"] Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.711746 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.714570 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hjs6p" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.714764 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 20:11:51 crc kubenswrapper[4802]: I1201 20:11:51.714889 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.489894 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.495009 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf6ml\" (UniqueName: \"kubernetes.io/projected/0ca0e887-c648-46c5-941a-96fc3a8e551e-kube-api-access-mf6ml\") pod \"metallb-operator-webhook-server-d94989d67-mpm9p\" (UID: \"0ca0e887-c648-46c5-941a-96fc3a8e551e\") " pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.495078 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ca0e887-c648-46c5-941a-96fc3a8e551e-apiservice-cert\") pod \"metallb-operator-webhook-server-d94989d67-mpm9p\" (UID: \"0ca0e887-c648-46c5-941a-96fc3a8e551e\") " pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.495324 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/0ca0e887-c648-46c5-941a-96fc3a8e551e-webhook-cert\") pod \"metallb-operator-webhook-server-d94989d67-mpm9p\" (UID: \"0ca0e887-c648-46c5-941a-96fc3a8e551e\") " pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.599206 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ca0e887-c648-46c5-941a-96fc3a8e551e-apiservice-cert\") pod \"metallb-operator-webhook-server-d94989d67-mpm9p\" (UID: \"0ca0e887-c648-46c5-941a-96fc3a8e551e\") " pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.599769 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ca0e887-c648-46c5-941a-96fc3a8e551e-webhook-cert\") pod \"metallb-operator-webhook-server-d94989d67-mpm9p\" (UID: \"0ca0e887-c648-46c5-941a-96fc3a8e551e\") " pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.599840 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf6ml\" (UniqueName: \"kubernetes.io/projected/0ca0e887-c648-46c5-941a-96fc3a8e551e-kube-api-access-mf6ml\") pod \"metallb-operator-webhook-server-d94989d67-mpm9p\" (UID: \"0ca0e887-c648-46c5-941a-96fc3a8e551e\") " pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.600575 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p"] Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.605255 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ca0e887-c648-46c5-941a-96fc3a8e551e-apiservice-cert\") pod 
\"metallb-operator-webhook-server-d94989d67-mpm9p\" (UID: \"0ca0e887-c648-46c5-941a-96fc3a8e551e\") " pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.606763 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ca0e887-c648-46c5-941a-96fc3a8e551e-webhook-cert\") pod \"metallb-operator-webhook-server-d94989d67-mpm9p\" (UID: \"0ca0e887-c648-46c5-941a-96fc3a8e551e\") " pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.628304 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf6ml\" (UniqueName: \"kubernetes.io/projected/0ca0e887-c648-46c5-941a-96fc3a8e551e-kube-api-access-mf6ml\") pod \"metallb-operator-webhook-server-d94989d67-mpm9p\" (UID: \"0ca0e887-c648-46c5-941a-96fc3a8e551e\") " pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.634037 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.856218 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2"] Dec 01 20:11:52 crc kubenswrapper[4802]: I1201 20:11:52.947084 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p"] Dec 01 20:11:52 crc kubenswrapper[4802]: W1201 20:11:52.958676 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ca0e887_c648_46c5_941a_96fc3a8e551e.slice/crio-2fba0ba147d5e3bfaa2c4f874fbafa8b53c8b0f1ab4e345d2a7a77f80c314382 WatchSource:0}: Error finding container 2fba0ba147d5e3bfaa2c4f874fbafa8b53c8b0f1ab4e345d2a7a77f80c314382: Status 404 returned error can't find the container with id 2fba0ba147d5e3bfaa2c4f874fbafa8b53c8b0f1ab4e345d2a7a77f80c314382 Dec 01 20:11:53 crc kubenswrapper[4802]: I1201 20:11:53.608219 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" event={"ID":"0ca0e887-c648-46c5-941a-96fc3a8e551e","Type":"ContainerStarted","Data":"2fba0ba147d5e3bfaa2c4f874fbafa8b53c8b0f1ab4e345d2a7a77f80c314382"} Dec 01 20:11:53 crc kubenswrapper[4802]: I1201 20:11:53.609941 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" event={"ID":"27545afc-bda6-468c-b9c2-8ab3182546c8","Type":"ContainerStarted","Data":"b0337b0964a61597509bbeb4ab2a9d0ee944440b87886657c873958d794bcf83"} Dec 01 20:11:57 crc kubenswrapper[4802]: I1201 20:11:57.641486 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" 
event={"ID":"27545afc-bda6-468c-b9c2-8ab3182546c8","Type":"ContainerStarted","Data":"9afbe44e064f45c1c93ffd442b6c6670d82d2c51d90a7729549d7a6ad54086ca"} Dec 01 20:11:57 crc kubenswrapper[4802]: I1201 20:11:57.642472 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:11:57 crc kubenswrapper[4802]: I1201 20:11:57.668122 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" podStartSLOduration=3.031512386 podStartE2EDuration="6.668098378s" podCreationTimestamp="2025-12-01 20:11:51 +0000 UTC" firstStartedPulling="2025-12-01 20:11:52.875441978 +0000 UTC m=+934.438001619" lastFinishedPulling="2025-12-01 20:11:56.51202797 +0000 UTC m=+938.074587611" observedRunningTime="2025-12-01 20:11:57.664780824 +0000 UTC m=+939.227340465" watchObservedRunningTime="2025-12-01 20:11:57.668098378 +0000 UTC m=+939.230658019" Dec 01 20:11:58 crc kubenswrapper[4802]: I1201 20:11:58.088843 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:11:58 crc kubenswrapper[4802]: I1201 20:11:58.089607 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:11:59 crc kubenswrapper[4802]: I1201 20:11:59.662732 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" 
event={"ID":"0ca0e887-c648-46c5-941a-96fc3a8e551e","Type":"ContainerStarted","Data":"18e916bdc07a2afdd06316c586cd2869f5472bb7695fc197c7bcf34b3f5ec13c"} Dec 01 20:11:59 crc kubenswrapper[4802]: I1201 20:11:59.663558 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:11:59 crc kubenswrapper[4802]: I1201 20:11:59.686612 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" podStartSLOduration=2.539082025 podStartE2EDuration="8.686575934s" podCreationTimestamp="2025-12-01 20:11:51 +0000 UTC" firstStartedPulling="2025-12-01 20:11:52.965150475 +0000 UTC m=+934.527710116" lastFinishedPulling="2025-12-01 20:11:59.112644384 +0000 UTC m=+940.675204025" observedRunningTime="2025-12-01 20:11:59.684310253 +0000 UTC m=+941.246869914" watchObservedRunningTime="2025-12-01 20:11:59.686575934 +0000 UTC m=+941.249135575" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.535870 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dct4z"] Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.538515 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.556126 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dct4z"] Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.605686 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-utilities\") pod \"certified-operators-dct4z\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.605769 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-catalog-content\") pod \"certified-operators-dct4z\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.605812 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgjqc\" (UniqueName: \"kubernetes.io/projected/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-kube-api-access-bgjqc\") pod \"certified-operators-dct4z\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.707201 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-utilities\") pod \"certified-operators-dct4z\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.707745 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-catalog-content\") pod \"certified-operators-dct4z\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.707778 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgjqc\" (UniqueName: \"kubernetes.io/projected/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-kube-api-access-bgjqc\") pod \"certified-operators-dct4z\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.707969 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-utilities\") pod \"certified-operators-dct4z\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.708433 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-catalog-content\") pod \"certified-operators-dct4z\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.745147 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgjqc\" (UniqueName: \"kubernetes.io/projected/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-kube-api-access-bgjqc\") pod \"certified-operators-dct4z\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:07 crc kubenswrapper[4802]: I1201 20:12:07.864111 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:08 crc kubenswrapper[4802]: I1201 20:12:08.442483 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dct4z"] Dec 01 20:12:08 crc kubenswrapper[4802]: I1201 20:12:08.729137 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dct4z" event={"ID":"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0","Type":"ContainerStarted","Data":"0b88d16c2b4e0560849a3b49102580458990770a1875cb6966bdb36786efe953"} Dec 01 20:12:09 crc kubenswrapper[4802]: I1201 20:12:09.737665 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dct4z" event={"ID":"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0","Type":"ContainerStarted","Data":"57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178"} Dec 01 20:12:10 crc kubenswrapper[4802]: I1201 20:12:10.747325 4802 generic.go:334] "Generic (PLEG): container finished" podID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerID="57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178" exitCode=0 Dec 01 20:12:10 crc kubenswrapper[4802]: I1201 20:12:10.747380 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dct4z" event={"ID":"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0","Type":"ContainerDied","Data":"57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178"} Dec 01 20:12:12 crc kubenswrapper[4802]: I1201 20:12:12.641977 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-d94989d67-mpm9p" Dec 01 20:12:12 crc kubenswrapper[4802]: I1201 20:12:12.775333 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dct4z" 
event={"ID":"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0","Type":"ContainerStarted","Data":"8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9"} Dec 01 20:12:13 crc kubenswrapper[4802]: I1201 20:12:13.788993 4802 generic.go:334] "Generic (PLEG): container finished" podID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerID="8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9" exitCode=0 Dec 01 20:12:13 crc kubenswrapper[4802]: I1201 20:12:13.789075 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dct4z" event={"ID":"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0","Type":"ContainerDied","Data":"8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9"} Dec 01 20:12:15 crc kubenswrapper[4802]: I1201 20:12:15.805071 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dct4z" event={"ID":"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0","Type":"ContainerStarted","Data":"14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754"} Dec 01 20:12:15 crc kubenswrapper[4802]: I1201 20:12:15.829951 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dct4z" podStartSLOduration=3.992168668 podStartE2EDuration="8.829923045s" podCreationTimestamp="2025-12-01 20:12:07 +0000 UTC" firstStartedPulling="2025-12-01 20:12:10.749467448 +0000 UTC m=+952.312027089" lastFinishedPulling="2025-12-01 20:12:15.587221825 +0000 UTC m=+957.149781466" observedRunningTime="2025-12-01 20:12:15.823759362 +0000 UTC m=+957.386319013" watchObservedRunningTime="2025-12-01 20:12:15.829923045 +0000 UTC m=+957.392482686" Dec 01 20:12:17 crc kubenswrapper[4802]: I1201 20:12:17.864427 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:17 crc kubenswrapper[4802]: I1201 20:12:17.864915 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:17 crc kubenswrapper[4802]: I1201 20:12:17.907058 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:27 crc kubenswrapper[4802]: I1201 20:12:27.913478 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:28 crc kubenswrapper[4802]: I1201 20:12:28.088917 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:12:28 crc kubenswrapper[4802]: I1201 20:12:28.089034 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:12:28 crc kubenswrapper[4802]: I1201 20:12:28.089105 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:12:28 crc kubenswrapper[4802]: I1201 20:12:28.090044 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5dce8f18ad191d77889a5744c4581ae578cf1b86f80207070351f56ae6cf862"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:12:28 crc kubenswrapper[4802]: I1201 20:12:28.090122 4802 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://b5dce8f18ad191d77889a5744c4581ae578cf1b86f80207070351f56ae6cf862" gracePeriod=600 Dec 01 20:12:28 crc kubenswrapper[4802]: I1201 20:12:28.907682 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="b5dce8f18ad191d77889a5744c4581ae578cf1b86f80207070351f56ae6cf862" exitCode=0 Dec 01 20:12:28 crc kubenswrapper[4802]: I1201 20:12:28.907749 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"b5dce8f18ad191d77889a5744c4581ae578cf1b86f80207070351f56ae6cf862"} Dec 01 20:12:28 crc kubenswrapper[4802]: I1201 20:12:28.908747 4802 scope.go:117] "RemoveContainer" containerID="7b877900cbea92263f9945a8b0d73242a0986bd9839c145102aaab83d242fee9" Dec 01 20:12:29 crc kubenswrapper[4802]: I1201 20:12:29.920574 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"cdab3d21e9b678aa33ac2623e4e7233c7b5184d7898d89b02a2bb479a6f32dd9"} Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.329485 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dct4z"] Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.330050 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dct4z" podUID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerName="registry-server" containerID="cri-o://14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754" gracePeriod=2 Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.777889 4802 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.921054 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgjqc\" (UniqueName: \"kubernetes.io/projected/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-kube-api-access-bgjqc\") pod \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.921142 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-utilities\") pod \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.921298 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-catalog-content\") pod \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\" (UID: \"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0\") " Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.922485 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-utilities" (OuterVolumeSpecName: "utilities") pod "7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" (UID: "7f3ab0a7-4f00-45be-b5b8-69891cdef4a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.934596 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-kube-api-access-bgjqc" (OuterVolumeSpecName: "kube-api-access-bgjqc") pod "7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" (UID: "7f3ab0a7-4f00-45be-b5b8-69891cdef4a0"). 
InnerVolumeSpecName "kube-api-access-bgjqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.940440 4802 generic.go:334] "Generic (PLEG): container finished" podID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerID="14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754" exitCode=0 Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.940516 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dct4z" Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.940592 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dct4z" event={"ID":"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0","Type":"ContainerDied","Data":"14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754"} Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.940708 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dct4z" event={"ID":"7f3ab0a7-4f00-45be-b5b8-69891cdef4a0","Type":"ContainerDied","Data":"0b88d16c2b4e0560849a3b49102580458990770a1875cb6966bdb36786efe953"} Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.940738 4802 scope.go:117] "RemoveContainer" containerID="14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754" Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.962997 4802 scope.go:117] "RemoveContainer" containerID="8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9" Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.971333 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" (UID: "7f3ab0a7-4f00-45be-b5b8-69891cdef4a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.981610 4802 scope.go:117] "RemoveContainer" containerID="57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178" Dec 01 20:12:30 crc kubenswrapper[4802]: I1201 20:12:30.999428 4802 scope.go:117] "RemoveContainer" containerID="14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754" Dec 01 20:12:31 crc kubenswrapper[4802]: E1201 20:12:31.000141 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754\": container with ID starting with 14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754 not found: ID does not exist" containerID="14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754" Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.000247 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754"} err="failed to get container status \"14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754\": rpc error: code = NotFound desc = could not find container \"14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754\": container with ID starting with 14e73acc1f6ee364db1d4ab35c54d7c5e3bc62cc5a0e09928538d38652aaf754 not found: ID does not exist" Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.000304 4802 scope.go:117] "RemoveContainer" containerID="8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9" Dec 01 20:12:31 crc kubenswrapper[4802]: E1201 20:12:31.001026 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9\": container with ID starting with 
8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9 not found: ID does not exist" containerID="8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9" Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.001068 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9"} err="failed to get container status \"8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9\": rpc error: code = NotFound desc = could not find container \"8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9\": container with ID starting with 8759eab1b0f321c7afb23c5c6d973c919909eada58c0ec2e0378767ae9d727d9 not found: ID does not exist" Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.001111 4802 scope.go:117] "RemoveContainer" containerID="57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178" Dec 01 20:12:31 crc kubenswrapper[4802]: E1201 20:12:31.001725 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178\": container with ID starting with 57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178 not found: ID does not exist" containerID="57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178" Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.001842 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178"} err="failed to get container status \"57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178\": rpc error: code = NotFound desc = could not find container \"57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178\": container with ID starting with 57d2736466c2adbeddde2acad500c62d7fd0511cdeabb5ad7cafdd555c84b178 not found: ID does not 
exist" Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.023434 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.023507 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgjqc\" (UniqueName: \"kubernetes.io/projected/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-kube-api-access-bgjqc\") on node \"crc\" DevicePath \"\"" Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.023532 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.272319 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dct4z"] Dec 01 20:12:31 crc kubenswrapper[4802]: I1201 20:12:31.275591 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dct4z"] Dec 01 20:12:32 crc kubenswrapper[4802]: I1201 20:12:32.494826 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-79cd97f75d-bjkc2" Dec 01 20:12:32 crc kubenswrapper[4802]: I1201 20:12:32.731504 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" path="/var/lib/kubelet/pods/7f3ab0a7-4f00-45be-b5b8-69891cdef4a0/volumes" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.252322 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-btwhk"] Dec 01 20:12:33 crc kubenswrapper[4802]: E1201 20:12:33.252829 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerName="registry-server" Dec 01 20:12:33 crc 
kubenswrapper[4802]: I1201 20:12:33.252858 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerName="registry-server" Dec 01 20:12:33 crc kubenswrapper[4802]: E1201 20:12:33.252873 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerName="extract-utilities" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.252882 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerName="extract-utilities" Dec 01 20:12:33 crc kubenswrapper[4802]: E1201 20:12:33.252895 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerName="extract-content" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.252903 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerName="extract-content" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.253087 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3ab0a7-4f00-45be-b5b8-69891cdef4a0" containerName="registry-server" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.256047 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc"] Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.256745 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.257424 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.261357 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-rmjg2" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.261706 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.262216 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.262421 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.333679 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc"] Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.367175 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-frr-sockets\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.367256 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14b6cfe2-8222-45da-808e-2a3d64d13b94-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5rvzc\" (UID: \"14b6cfe2-8222-45da-808e-2a3d64d13b94\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.367286 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z42ss\" (UniqueName: 
\"kubernetes.io/projected/ca5ccff0-46eb-46dd-aa6b-a0069276275d-kube-api-access-z42ss\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.367324 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-reloader\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.367433 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-frr-conf\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.367455 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-metrics\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.367597 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtft2\" (UniqueName: \"kubernetes.io/projected/14b6cfe2-8222-45da-808e-2a3d64d13b94-kube-api-access-vtft2\") pod \"frr-k8s-webhook-server-7fcb986d4-5rvzc\" (UID: \"14b6cfe2-8222-45da-808e-2a3d64d13b94\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.367730 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ca5ccff0-46eb-46dd-aa6b-a0069276275d-metrics-certs\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.367794 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ca5ccff0-46eb-46dd-aa6b-a0069276275d-frr-startup\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.386172 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zjgml"] Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.387079 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.397877 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.398057 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.398243 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.398350 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-b59x9" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.431534 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-x5zbd"] Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.433424 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.438575 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.461337 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-x5zbd"] Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469335 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-metrics\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469475 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtft2\" (UniqueName: \"kubernetes.io/projected/14b6cfe2-8222-45da-808e-2a3d64d13b94-kube-api-access-vtft2\") pod \"frr-k8s-webhook-server-7fcb986d4-5rvzc\" (UID: \"14b6cfe2-8222-45da-808e-2a3d64d13b94\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469509 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxqqs\" (UniqueName: \"kubernetes.io/projected/4c77b924-7d9d-48b6-9e00-476f7df7104c-kube-api-access-kxqqs\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469543 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca5ccff0-46eb-46dd-aa6b-a0069276275d-metrics-certs\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 
20:12:33.469571 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ca5ccff0-46eb-46dd-aa6b-a0069276275d-frr-startup\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469634 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-metrics-certs\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469657 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-frr-sockets\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469686 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14b6cfe2-8222-45da-808e-2a3d64d13b94-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5rvzc\" (UID: \"14b6cfe2-8222-45da-808e-2a3d64d13b94\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469708 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-memberlist\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469730 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z42ss\" (UniqueName: 
\"kubernetes.io/projected/ca5ccff0-46eb-46dd-aa6b-a0069276275d-kube-api-access-z42ss\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469763 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-reloader\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469786 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4c77b924-7d9d-48b6-9e00-476f7df7104c-metallb-excludel2\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.469816 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-frr-conf\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: E1201 20:12:33.469926 4802 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 01 20:12:33 crc kubenswrapper[4802]: E1201 20:12:33.470037 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14b6cfe2-8222-45da-808e-2a3d64d13b94-cert podName:14b6cfe2-8222-45da-808e-2a3d64d13b94 nodeName:}" failed. No retries permitted until 2025-12-01 20:12:33.970006229 +0000 UTC m=+975.532565870 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/14b6cfe2-8222-45da-808e-2a3d64d13b94-cert") pod "frr-k8s-webhook-server-7fcb986d4-5rvzc" (UID: "14b6cfe2-8222-45da-808e-2a3d64d13b94") : secret "frr-k8s-webhook-server-cert" not found Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.470534 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-frr-sockets\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.470626 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-metrics\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.470684 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-frr-conf\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.470891 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ca5ccff0-46eb-46dd-aa6b-a0069276275d-reloader\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.470924 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ca5ccff0-46eb-46dd-aa6b-a0069276275d-frr-startup\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" 
Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.498404 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca5ccff0-46eb-46dd-aa6b-a0069276275d-metrics-certs\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.503975 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtft2\" (UniqueName: \"kubernetes.io/projected/14b6cfe2-8222-45da-808e-2a3d64d13b94-kube-api-access-vtft2\") pod \"frr-k8s-webhook-server-7fcb986d4-5rvzc\" (UID: \"14b6cfe2-8222-45da-808e-2a3d64d13b94\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.504503 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z42ss\" (UniqueName: \"kubernetes.io/projected/ca5ccff0-46eb-46dd-aa6b-a0069276275d-kube-api-access-z42ss\") pod \"frr-k8s-btwhk\" (UID: \"ca5ccff0-46eb-46dd-aa6b-a0069276275d\") " pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.571628 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4c77b924-7d9d-48b6-9e00-476f7df7104c-metallb-excludel2\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.572342 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxqqs\" (UniqueName: \"kubernetes.io/projected/4c77b924-7d9d-48b6-9e00-476f7df7104c-kube-api-access-kxqqs\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.572398 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36c33153-2c15-48db-9ab8-a52854a85093-metrics-certs\") pod \"controller-f8648f98b-x5zbd\" (UID: \"36c33153-2c15-48db-9ab8-a52854a85093\") " pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.572435 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgrmm\" (UniqueName: \"kubernetes.io/projected/36c33153-2c15-48db-9ab8-a52854a85093-kube-api-access-dgrmm\") pod \"controller-f8648f98b-x5zbd\" (UID: \"36c33153-2c15-48db-9ab8-a52854a85093\") " pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.572456 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-metrics-certs\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.572666 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c33153-2c15-48db-9ab8-a52854a85093-cert\") pod \"controller-f8648f98b-x5zbd\" (UID: \"36c33153-2c15-48db-9ab8-a52854a85093\") " pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.572721 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-memberlist\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: E1201 20:12:33.572903 4802 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" 
not found Dec 01 20:12:33 crc kubenswrapper[4802]: E1201 20:12:33.572979 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-memberlist podName:4c77b924-7d9d-48b6-9e00-476f7df7104c nodeName:}" failed. No retries permitted until 2025-12-01 20:12:34.072957243 +0000 UTC m=+975.635516884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-memberlist") pod "speaker-zjgml" (UID: "4c77b924-7d9d-48b6-9e00-476f7df7104c") : secret "metallb-memberlist" not found Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.573135 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4c77b924-7d9d-48b6-9e00-476f7df7104c-metallb-excludel2\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: E1201 20:12:33.573262 4802 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 01 20:12:33 crc kubenswrapper[4802]: E1201 20:12:33.573353 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-metrics-certs podName:4c77b924-7d9d-48b6-9e00-476f7df7104c nodeName:}" failed. No retries permitted until 2025-12-01 20:12:34.073326794 +0000 UTC m=+975.635886435 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-metrics-certs") pod "speaker-zjgml" (UID: "4c77b924-7d9d-48b6-9e00-476f7df7104c") : secret "speaker-certs-secret" not found Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.589371 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.608310 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxqqs\" (UniqueName: \"kubernetes.io/projected/4c77b924-7d9d-48b6-9e00-476f7df7104c-kube-api-access-kxqqs\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.674348 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36c33153-2c15-48db-9ab8-a52854a85093-metrics-certs\") pod \"controller-f8648f98b-x5zbd\" (UID: \"36c33153-2c15-48db-9ab8-a52854a85093\") " pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.674438 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgrmm\" (UniqueName: \"kubernetes.io/projected/36c33153-2c15-48db-9ab8-a52854a85093-kube-api-access-dgrmm\") pod \"controller-f8648f98b-x5zbd\" (UID: \"36c33153-2c15-48db-9ab8-a52854a85093\") " pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.674480 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c33153-2c15-48db-9ab8-a52854a85093-cert\") pod \"controller-f8648f98b-x5zbd\" (UID: \"36c33153-2c15-48db-9ab8-a52854a85093\") " pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.677003 4802 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.680167 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36c33153-2c15-48db-9ab8-a52854a85093-metrics-certs\") 
pod \"controller-f8648f98b-x5zbd\" (UID: \"36c33153-2c15-48db-9ab8-a52854a85093\") " pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.687942 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c33153-2c15-48db-9ab8-a52854a85093-cert\") pod \"controller-f8648f98b-x5zbd\" (UID: \"36c33153-2c15-48db-9ab8-a52854a85093\") " pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.699229 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgrmm\" (UniqueName: \"kubernetes.io/projected/36c33153-2c15-48db-9ab8-a52854a85093-kube-api-access-dgrmm\") pod \"controller-f8648f98b-x5zbd\" (UID: \"36c33153-2c15-48db-9ab8-a52854a85093\") " pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.753612 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.973970 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerStarted","Data":"e10bfa6216b2fdc87010ae57f0db0aecfe19116ee15cf8df2149909f49016115"} Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.976178 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-x5zbd"] Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.979166 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14b6cfe2-8222-45da-808e-2a3d64d13b94-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5rvzc\" (UID: \"14b6cfe2-8222-45da-808e-2a3d64d13b94\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:33 crc kubenswrapper[4802]: W1201 20:12:33.983697 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c33153_2c15_48db_9ab8_a52854a85093.slice/crio-fb6dd9bf7a144fa375b4fa88ee19cb7f9172cc136cdd8a21480862fcee954e62 WatchSource:0}: Error finding container fb6dd9bf7a144fa375b4fa88ee19cb7f9172cc136cdd8a21480862fcee954e62: Status 404 returned error can't find the container with id fb6dd9bf7a144fa375b4fa88ee19cb7f9172cc136cdd8a21480862fcee954e62 Dec 01 20:12:33 crc kubenswrapper[4802]: I1201 20:12:33.986082 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14b6cfe2-8222-45da-808e-2a3d64d13b94-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5rvzc\" (UID: \"14b6cfe2-8222-45da-808e-2a3d64d13b94\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:34 crc kubenswrapper[4802]: I1201 20:12:34.081306 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-metrics-certs\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:34 crc kubenswrapper[4802]: I1201 20:12:34.081392 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-memberlist\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:34 crc kubenswrapper[4802]: E1201 20:12:34.081568 4802 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 20:12:34 crc kubenswrapper[4802]: E1201 20:12:34.081647 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-memberlist podName:4c77b924-7d9d-48b6-9e00-476f7df7104c nodeName:}" failed. No retries permitted until 2025-12-01 20:12:35.081622573 +0000 UTC m=+976.644182214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-memberlist") pod "speaker-zjgml" (UID: "4c77b924-7d9d-48b6-9e00-476f7df7104c") : secret "metallb-memberlist" not found Dec 01 20:12:34 crc kubenswrapper[4802]: I1201 20:12:34.085916 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-metrics-certs\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:34 crc kubenswrapper[4802]: I1201 20:12:34.179912 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:34 crc kubenswrapper[4802]: I1201 20:12:34.426384 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc"] Dec 01 20:12:34 crc kubenswrapper[4802]: W1201 20:12:34.434339 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b6cfe2_8222_45da_808e_2a3d64d13b94.slice/crio-f7dd07dc82fb1dd0ae760aec3e95b783c7dafc28cc85a11ad0aa302d4c904c30 WatchSource:0}: Error finding container f7dd07dc82fb1dd0ae760aec3e95b783c7dafc28cc85a11ad0aa302d4c904c30: Status 404 returned error can't find the container with id f7dd07dc82fb1dd0ae760aec3e95b783c7dafc28cc85a11ad0aa302d4c904c30 Dec 01 20:12:34 crc kubenswrapper[4802]: I1201 20:12:34.982492 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" event={"ID":"14b6cfe2-8222-45da-808e-2a3d64d13b94","Type":"ContainerStarted","Data":"f7dd07dc82fb1dd0ae760aec3e95b783c7dafc28cc85a11ad0aa302d4c904c30"} Dec 01 20:12:34 crc kubenswrapper[4802]: I1201 20:12:34.985095 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-x5zbd" event={"ID":"36c33153-2c15-48db-9ab8-a52854a85093","Type":"ContainerStarted","Data":"d64f3efc973b6c15755cf543090d69037f5c5b53cd35edafdb2fcdf35e56e3e2"} Dec 01 20:12:34 crc kubenswrapper[4802]: I1201 20:12:34.985164 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-x5zbd" event={"ID":"36c33153-2c15-48db-9ab8-a52854a85093","Type":"ContainerStarted","Data":"479b804f7f22a55bf1a2f23387f4ca742deecf9bf13d83fddc0917dcb4b21d09"} Dec 01 20:12:34 crc kubenswrapper[4802]: I1201 20:12:34.985178 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-x5zbd" 
event={"ID":"36c33153-2c15-48db-9ab8-a52854a85093","Type":"ContainerStarted","Data":"fb6dd9bf7a144fa375b4fa88ee19cb7f9172cc136cdd8a21480862fcee954e62"} Dec 01 20:12:35 crc kubenswrapper[4802]: I1201 20:12:35.003819 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-x5zbd" podStartSLOduration=2.003793858 podStartE2EDuration="2.003793858s" podCreationTimestamp="2025-12-01 20:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:12:35.002732345 +0000 UTC m=+976.565291986" watchObservedRunningTime="2025-12-01 20:12:35.003793858 +0000 UTC m=+976.566353509" Dec 01 20:12:35 crc kubenswrapper[4802]: I1201 20:12:35.099728 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-memberlist\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:35 crc kubenswrapper[4802]: I1201 20:12:35.108658 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4c77b924-7d9d-48b6-9e00-476f7df7104c-memberlist\") pod \"speaker-zjgml\" (UID: \"4c77b924-7d9d-48b6-9e00-476f7df7104c\") " pod="metallb-system/speaker-zjgml" Dec 01 20:12:35 crc kubenswrapper[4802]: I1201 20:12:35.205810 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-zjgml" Dec 01 20:12:35 crc kubenswrapper[4802]: W1201 20:12:35.227891 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c77b924_7d9d_48b6_9e00_476f7df7104c.slice/crio-5f9488ddc9247c337cc18a709569e3bdee2c549b7e3a7ddd274ddfcd66ccebed WatchSource:0}: Error finding container 5f9488ddc9247c337cc18a709569e3bdee2c549b7e3a7ddd274ddfcd66ccebed: Status 404 returned error can't find the container with id 5f9488ddc9247c337cc18a709569e3bdee2c549b7e3a7ddd274ddfcd66ccebed Dec 01 20:12:36 crc kubenswrapper[4802]: I1201 20:12:36.005985 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zjgml" event={"ID":"4c77b924-7d9d-48b6-9e00-476f7df7104c","Type":"ContainerStarted","Data":"1b41b107eb567642d403675810838f7d1cc245cdc639a28cfd403a06050302a1"} Dec 01 20:12:36 crc kubenswrapper[4802]: I1201 20:12:36.006627 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zjgml" event={"ID":"4c77b924-7d9d-48b6-9e00-476f7df7104c","Type":"ContainerStarted","Data":"74b1cd7020d24e27c49c96382337a987e26ed9d98acaa2b0fdef674ceb41343d"} Dec 01 20:12:36 crc kubenswrapper[4802]: I1201 20:12:36.006650 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zjgml" event={"ID":"4c77b924-7d9d-48b6-9e00-476f7df7104c","Type":"ContainerStarted","Data":"5f9488ddc9247c337cc18a709569e3bdee2c549b7e3a7ddd274ddfcd66ccebed"} Dec 01 20:12:36 crc kubenswrapper[4802]: I1201 20:12:36.006812 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:36 crc kubenswrapper[4802]: I1201 20:12:36.050244 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zjgml" podStartSLOduration=3.050217454 podStartE2EDuration="3.050217454s" podCreationTimestamp="2025-12-01 20:12:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:12:36.043517703 +0000 UTC m=+977.606077364" watchObservedRunningTime="2025-12-01 20:12:36.050217454 +0000 UTC m=+977.612777175" Dec 01 20:12:44 crc kubenswrapper[4802]: I1201 20:12:44.066830 4802 generic.go:334] "Generic (PLEG): container finished" podID="ca5ccff0-46eb-46dd-aa6b-a0069276275d" containerID="3231c895360aaecac38480fe7ec833b1e3fe2970f9a1472c77b754026eb971d1" exitCode=0 Dec 01 20:12:44 crc kubenswrapper[4802]: I1201 20:12:44.066934 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerDied","Data":"3231c895360aaecac38480fe7ec833b1e3fe2970f9a1472c77b754026eb971d1"} Dec 01 20:12:44 crc kubenswrapper[4802]: I1201 20:12:44.072111 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" event={"ID":"14b6cfe2-8222-45da-808e-2a3d64d13b94","Type":"ContainerStarted","Data":"148e4c15d53c571e6970f0d1b2a4aed8d8f9b8bcf8a33954a624581ed3ad9f77"} Dec 01 20:12:44 crc kubenswrapper[4802]: I1201 20:12:44.072367 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:44 crc kubenswrapper[4802]: I1201 20:12:44.114870 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" podStartSLOduration=2.252490075 podStartE2EDuration="11.114839537s" podCreationTimestamp="2025-12-01 20:12:33 +0000 UTC" firstStartedPulling="2025-12-01 20:12:34.439008104 +0000 UTC m=+976.001567745" lastFinishedPulling="2025-12-01 20:12:43.301357566 +0000 UTC m=+984.863917207" observedRunningTime="2025-12-01 20:12:44.112408461 +0000 UTC m=+985.674968102" watchObservedRunningTime="2025-12-01 20:12:44.114839537 +0000 UTC m=+985.677399178" Dec 01 
20:12:45 crc kubenswrapper[4802]: I1201 20:12:45.082667 4802 generic.go:334] "Generic (PLEG): container finished" podID="ca5ccff0-46eb-46dd-aa6b-a0069276275d" containerID="77f4a8f3ddccc5d367462ee79f9147a9a73578f6bbc5cc618f18f2e8c5e3343a" exitCode=0 Dec 01 20:12:45 crc kubenswrapper[4802]: I1201 20:12:45.082774 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerDied","Data":"77f4a8f3ddccc5d367462ee79f9147a9a73578f6bbc5cc618f18f2e8c5e3343a"} Dec 01 20:12:45 crc kubenswrapper[4802]: I1201 20:12:45.207624 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zjgml" Dec 01 20:12:45 crc kubenswrapper[4802]: I1201 20:12:45.211509 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zjgml" Dec 01 20:12:46 crc kubenswrapper[4802]: I1201 20:12:46.099009 4802 generic.go:334] "Generic (PLEG): container finished" podID="ca5ccff0-46eb-46dd-aa6b-a0069276275d" containerID="88d6d1db67d9c81f0d7d6dbf6abb3930320c594eaa49b477ed8c92173a4fd67e" exitCode=0 Dec 01 20:12:46 crc kubenswrapper[4802]: I1201 20:12:46.099653 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerDied","Data":"88d6d1db67d9c81f0d7d6dbf6abb3930320c594eaa49b477ed8c92173a4fd67e"} Dec 01 20:12:47 crc kubenswrapper[4802]: I1201 20:12:47.125180 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerStarted","Data":"d8cfb8c240f0a8e878ef93fdbbedab4dc140ab2ff43203a5ad0c8091dd6e0097"} Dec 01 20:12:47 crc kubenswrapper[4802]: I1201 20:12:47.125725 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" 
event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerStarted","Data":"86738a67d4d567ba05d0b321d03c60d6a3461617071bd02639a1f90ec4ac876b"} Dec 01 20:12:47 crc kubenswrapper[4802]: I1201 20:12:47.125756 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerStarted","Data":"bb2c62fd288e85de8b0bd0bfec93035d75e901feda28715af8d9a0291e3af9e2"} Dec 01 20:12:47 crc kubenswrapper[4802]: I1201 20:12:47.125771 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerStarted","Data":"c9dcb4d74771b921c44e9be741180e436e93fe2a8ec7eb5dead04895480fa309"} Dec 01 20:12:47 crc kubenswrapper[4802]: I1201 20:12:47.125788 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerStarted","Data":"af5bcee4ba9b96f95f882a3b6d9492a9570185f9619df882b616c20e60d2edf6"} Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.141390 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-btwhk" event={"ID":"ca5ccff0-46eb-46dd-aa6b-a0069276275d","Type":"ContainerStarted","Data":"d2d83e4191bab875b19929e799204afbc354320c838cbc6ec6ed6ea72105dff8"} Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.141930 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.192694 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-btwhk" podStartSLOduration=5.59893791 podStartE2EDuration="15.192654154s" podCreationTimestamp="2025-12-01 20:12:33 +0000 UTC" firstStartedPulling="2025-12-01 20:12:33.754062649 +0000 UTC m=+975.316622290" lastFinishedPulling="2025-12-01 20:12:43.347778883 +0000 UTC m=+984.910338534" 
observedRunningTime="2025-12-01 20:12:48.187086669 +0000 UTC m=+989.749646320" watchObservedRunningTime="2025-12-01 20:12:48.192654154 +0000 UTC m=+989.755213795" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.411888 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5njzl"] Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.413110 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5njzl" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.416402 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.416717 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-wx6tf" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.417137 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.428761 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5njzl"] Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.558597 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdg8f\" (UniqueName: \"kubernetes.io/projected/3d12e9fb-6800-4681-a9a2-443895de0dfc-kube-api-access-fdg8f\") pod \"openstack-operator-index-5njzl\" (UID: \"3d12e9fb-6800-4681-a9a2-443895de0dfc\") " pod="openstack-operators/openstack-operator-index-5njzl" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.590655 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.638640 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="metallb-system/frr-k8s-btwhk" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.660062 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdg8f\" (UniqueName: \"kubernetes.io/projected/3d12e9fb-6800-4681-a9a2-443895de0dfc-kube-api-access-fdg8f\") pod \"openstack-operator-index-5njzl\" (UID: \"3d12e9fb-6800-4681-a9a2-443895de0dfc\") " pod="openstack-operators/openstack-operator-index-5njzl" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.687220 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdg8f\" (UniqueName: \"kubernetes.io/projected/3d12e9fb-6800-4681-a9a2-443895de0dfc-kube-api-access-fdg8f\") pod \"openstack-operator-index-5njzl\" (UID: \"3d12e9fb-6800-4681-a9a2-443895de0dfc\") " pod="openstack-operators/openstack-operator-index-5njzl" Dec 01 20:12:48 crc kubenswrapper[4802]: I1201 20:12:48.747105 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5njzl" Dec 01 20:12:49 crc kubenswrapper[4802]: I1201 20:12:49.211578 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5njzl"] Dec 01 20:12:50 crc kubenswrapper[4802]: I1201 20:12:50.159699 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5njzl" event={"ID":"3d12e9fb-6800-4681-a9a2-443895de0dfc","Type":"ContainerStarted","Data":"698af94c20b089e35a7b098861ff8d4657282979085e90f970b4e90d134b5d7c"} Dec 01 20:12:51 crc kubenswrapper[4802]: I1201 20:12:51.790937 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5njzl"] Dec 01 20:12:52 crc kubenswrapper[4802]: I1201 20:12:52.474521 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-t7nmz"] Dec 01 20:12:52 crc kubenswrapper[4802]: I1201 20:12:52.476290 4802 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t7nmz" Dec 01 20:12:52 crc kubenswrapper[4802]: I1201 20:12:52.486139 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t7nmz"] Dec 01 20:12:52 crc kubenswrapper[4802]: I1201 20:12:52.639555 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-645hx\" (UniqueName: \"kubernetes.io/projected/47078ba5-f704-4077-93af-c0afffa2070f-kube-api-access-645hx\") pod \"openstack-operator-index-t7nmz\" (UID: \"47078ba5-f704-4077-93af-c0afffa2070f\") " pod="openstack-operators/openstack-operator-index-t7nmz" Dec 01 20:12:52 crc kubenswrapper[4802]: I1201 20:12:52.741084 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-645hx\" (UniqueName: \"kubernetes.io/projected/47078ba5-f704-4077-93af-c0afffa2070f-kube-api-access-645hx\") pod \"openstack-operator-index-t7nmz\" (UID: \"47078ba5-f704-4077-93af-c0afffa2070f\") " pod="openstack-operators/openstack-operator-index-t7nmz" Dec 01 20:12:52 crc kubenswrapper[4802]: I1201 20:12:52.783345 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-645hx\" (UniqueName: \"kubernetes.io/projected/47078ba5-f704-4077-93af-c0afffa2070f-kube-api-access-645hx\") pod \"openstack-operator-index-t7nmz\" (UID: \"47078ba5-f704-4077-93af-c0afffa2070f\") " pod="openstack-operators/openstack-operator-index-t7nmz" Dec 01 20:12:52 crc kubenswrapper[4802]: I1201 20:12:52.813014 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-t7nmz" Dec 01 20:12:53 crc kubenswrapper[4802]: I1201 20:12:53.478965 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t7nmz"] Dec 01 20:12:53 crc kubenswrapper[4802]: I1201 20:12:53.760210 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-x5zbd" Dec 01 20:12:54 crc kubenswrapper[4802]: I1201 20:12:54.185234 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5rvzc" Dec 01 20:12:54 crc kubenswrapper[4802]: I1201 20:12:54.196042 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t7nmz" event={"ID":"47078ba5-f704-4077-93af-c0afffa2070f","Type":"ContainerStarted","Data":"257ecd6cbb1f103e263dcc01648c9f62bf5f39cedaf728453af8b1a69358e149"} Dec 01 20:12:55 crc kubenswrapper[4802]: I1201 20:12:55.205995 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t7nmz" event={"ID":"47078ba5-f704-4077-93af-c0afffa2070f","Type":"ContainerStarted","Data":"651806a3f79a45c7ddb206c8472ea8cd71f1074bf9a2b7b948b572d0d087ee69"} Dec 01 20:12:55 crc kubenswrapper[4802]: I1201 20:12:55.209512 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5njzl" event={"ID":"3d12e9fb-6800-4681-a9a2-443895de0dfc","Type":"ContainerStarted","Data":"0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab"} Dec 01 20:12:55 crc kubenswrapper[4802]: I1201 20:12:55.209739 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5njzl" podUID="3d12e9fb-6800-4681-a9a2-443895de0dfc" containerName="registry-server" containerID="cri-o://0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab" gracePeriod=2 Dec 
01 20:12:55 crc kubenswrapper[4802]: I1201 20:12:55.235666 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-t7nmz" podStartSLOduration=2.063746164 podStartE2EDuration="3.235634209s" podCreationTimestamp="2025-12-01 20:12:52 +0000 UTC" firstStartedPulling="2025-12-01 20:12:53.53843023 +0000 UTC m=+995.100989871" lastFinishedPulling="2025-12-01 20:12:54.710318275 +0000 UTC m=+996.272877916" observedRunningTime="2025-12-01 20:12:55.225410207 +0000 UTC m=+996.787969888" watchObservedRunningTime="2025-12-01 20:12:55.235634209 +0000 UTC m=+996.798193850" Dec 01 20:12:55 crc kubenswrapper[4802]: I1201 20:12:55.244659 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5njzl" podStartSLOduration=1.755280037 podStartE2EDuration="7.244632642s" podCreationTimestamp="2025-12-01 20:12:48 +0000 UTC" firstStartedPulling="2025-12-01 20:12:49.219089401 +0000 UTC m=+990.781649052" lastFinishedPulling="2025-12-01 20:12:54.708442016 +0000 UTC m=+996.271001657" observedRunningTime="2025-12-01 20:12:55.244043523 +0000 UTC m=+996.806603164" watchObservedRunningTime="2025-12-01 20:12:55.244632642 +0000 UTC m=+996.807192293" Dec 01 20:12:55 crc kubenswrapper[4802]: I1201 20:12:55.643125 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5njzl" Dec 01 20:12:55 crc kubenswrapper[4802]: I1201 20:12:55.796070 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdg8f\" (UniqueName: \"kubernetes.io/projected/3d12e9fb-6800-4681-a9a2-443895de0dfc-kube-api-access-fdg8f\") pod \"3d12e9fb-6800-4681-a9a2-443895de0dfc\" (UID: \"3d12e9fb-6800-4681-a9a2-443895de0dfc\") " Dec 01 20:12:55 crc kubenswrapper[4802]: I1201 20:12:55.803700 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d12e9fb-6800-4681-a9a2-443895de0dfc-kube-api-access-fdg8f" (OuterVolumeSpecName: "kube-api-access-fdg8f") pod "3d12e9fb-6800-4681-a9a2-443895de0dfc" (UID: "3d12e9fb-6800-4681-a9a2-443895de0dfc"). InnerVolumeSpecName "kube-api-access-fdg8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:12:55 crc kubenswrapper[4802]: I1201 20:12:55.898184 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdg8f\" (UniqueName: \"kubernetes.io/projected/3d12e9fb-6800-4681-a9a2-443895de0dfc-kube-api-access-fdg8f\") on node \"crc\" DevicePath \"\"" Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.220115 4802 generic.go:334] "Generic (PLEG): container finished" podID="3d12e9fb-6800-4681-a9a2-443895de0dfc" containerID="0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab" exitCode=0 Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.220326 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5njzl" Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.220309 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5njzl" event={"ID":"3d12e9fb-6800-4681-a9a2-443895de0dfc","Type":"ContainerDied","Data":"0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab"} Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.221088 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5njzl" event={"ID":"3d12e9fb-6800-4681-a9a2-443895de0dfc","Type":"ContainerDied","Data":"698af94c20b089e35a7b098861ff8d4657282979085e90f970b4e90d134b5d7c"} Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.221146 4802 scope.go:117] "RemoveContainer" containerID="0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab" Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.248298 4802 scope.go:117] "RemoveContainer" containerID="0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab" Dec 01 20:12:56 crc kubenswrapper[4802]: E1201 20:12:56.249006 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab\": container with ID starting with 0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab not found: ID does not exist" containerID="0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab" Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.249087 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab"} err="failed to get container status \"0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab\": rpc error: code = NotFound desc = could not find container 
\"0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab\": container with ID starting with 0f46431beaef4a0e115ae64331f6b41cdef34fd06374715cf0519c1cf87e2bab not found: ID does not exist" Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.259735 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5njzl"] Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.266088 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5njzl"] Dec 01 20:12:56 crc kubenswrapper[4802]: I1201 20:12:56.729771 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d12e9fb-6800-4681-a9a2-443895de0dfc" path="/var/lib/kubelet/pods/3d12e9fb-6800-4681-a9a2-443895de0dfc/volumes" Dec 01 20:13:02 crc kubenswrapper[4802]: I1201 20:13:02.813926 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-t7nmz" Dec 01 20:13:02 crc kubenswrapper[4802]: I1201 20:13:02.815266 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-t7nmz" Dec 01 20:13:02 crc kubenswrapper[4802]: I1201 20:13:02.862643 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-t7nmz" Dec 01 20:13:03 crc kubenswrapper[4802]: I1201 20:13:03.313885 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-t7nmz" Dec 01 20:13:03 crc kubenswrapper[4802]: I1201 20:13:03.593342 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-btwhk" Dec 01 20:13:08 crc kubenswrapper[4802]: I1201 20:13:08.866179 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w"] Dec 01 20:13:08 crc kubenswrapper[4802]: E1201 
20:13:08.867377 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d12e9fb-6800-4681-a9a2-443895de0dfc" containerName="registry-server" Dec 01 20:13:08 crc kubenswrapper[4802]: I1201 20:13:08.867403 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d12e9fb-6800-4681-a9a2-443895de0dfc" containerName="registry-server" Dec 01 20:13:08 crc kubenswrapper[4802]: I1201 20:13:08.867644 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d12e9fb-6800-4681-a9a2-443895de0dfc" containerName="registry-server" Dec 01 20:13:08 crc kubenswrapper[4802]: I1201 20:13:08.869240 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:08 crc kubenswrapper[4802]: I1201 20:13:08.872347 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-chmvq" Dec 01 20:13:08 crc kubenswrapper[4802]: I1201 20:13:08.882591 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w"] Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.030275 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-bundle\") pod \"76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.030878 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-util\") pod \"76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w\" (UID: 
\"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.031000 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lxzn\" (UniqueName: \"kubernetes.io/projected/d4928d2b-4f1e-4d3c-a858-8180343a7405-kube-api-access-4lxzn\") pod \"76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.132975 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lxzn\" (UniqueName: \"kubernetes.io/projected/d4928d2b-4f1e-4d3c-a858-8180343a7405-kube-api-access-4lxzn\") pod \"76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.133305 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-bundle\") pod \"76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.133356 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-util\") pod \"76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " 
pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.134722 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-util\") pod \"76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.134727 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-bundle\") pod \"76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.173013 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lxzn\" (UniqueName: \"kubernetes.io/projected/d4928d2b-4f1e-4d3c-a858-8180343a7405-kube-api-access-4lxzn\") pod \"76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.206961 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:09 crc kubenswrapper[4802]: I1201 20:13:09.694359 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w"] Dec 01 20:13:09 crc kubenswrapper[4802]: W1201 20:13:09.701218 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4928d2b_4f1e_4d3c_a858_8180343a7405.slice/crio-7e17fe32f547b2988f6935af2b3b8262e570f3cdab6598b73cd2c89e75a7e319 WatchSource:0}: Error finding container 7e17fe32f547b2988f6935af2b3b8262e570f3cdab6598b73cd2c89e75a7e319: Status 404 returned error can't find the container with id 7e17fe32f547b2988f6935af2b3b8262e570f3cdab6598b73cd2c89e75a7e319 Dec 01 20:13:10 crc kubenswrapper[4802]: I1201 20:13:10.355164 4802 generic.go:334] "Generic (PLEG): container finished" podID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerID="9e18c0d2b49d7e548f352ca58019d2fd0e3766538d4048e4089132bf486d6550" exitCode=0 Dec 01 20:13:10 crc kubenswrapper[4802]: I1201 20:13:10.355264 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" event={"ID":"d4928d2b-4f1e-4d3c-a858-8180343a7405","Type":"ContainerDied","Data":"9e18c0d2b49d7e548f352ca58019d2fd0e3766538d4048e4089132bf486d6550"} Dec 01 20:13:10 crc kubenswrapper[4802]: I1201 20:13:10.355356 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" event={"ID":"d4928d2b-4f1e-4d3c-a858-8180343a7405","Type":"ContainerStarted","Data":"7e17fe32f547b2988f6935af2b3b8262e570f3cdab6598b73cd2c89e75a7e319"} Dec 01 20:13:11 crc kubenswrapper[4802]: I1201 20:13:11.368951 4802 generic.go:334] "Generic (PLEG): container finished" 
podID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerID="c1f0e71262b7ee05d324554d52f9a743d574b2f8e3a558b3571e5962d6feb8f3" exitCode=0 Dec 01 20:13:11 crc kubenswrapper[4802]: I1201 20:13:11.369044 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" event={"ID":"d4928d2b-4f1e-4d3c-a858-8180343a7405","Type":"ContainerDied","Data":"c1f0e71262b7ee05d324554d52f9a743d574b2f8e3a558b3571e5962d6feb8f3"} Dec 01 20:13:12 crc kubenswrapper[4802]: I1201 20:13:12.381681 4802 generic.go:334] "Generic (PLEG): container finished" podID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerID="01bd944297c74186f53f4323eed1dcf88bb55edcef34a5cadd6cdb70e178a616" exitCode=0 Dec 01 20:13:12 crc kubenswrapper[4802]: I1201 20:13:12.381753 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" event={"ID":"d4928d2b-4f1e-4d3c-a858-8180343a7405","Type":"ContainerDied","Data":"01bd944297c74186f53f4323eed1dcf88bb55edcef34a5cadd6cdb70e178a616"} Dec 01 20:13:13 crc kubenswrapper[4802]: I1201 20:13:13.753011 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:13 crc kubenswrapper[4802]: I1201 20:13:13.921284 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lxzn\" (UniqueName: \"kubernetes.io/projected/d4928d2b-4f1e-4d3c-a858-8180343a7405-kube-api-access-4lxzn\") pod \"d4928d2b-4f1e-4d3c-a858-8180343a7405\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " Dec 01 20:13:13 crc kubenswrapper[4802]: I1201 20:13:13.921461 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-bundle\") pod \"d4928d2b-4f1e-4d3c-a858-8180343a7405\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " Dec 01 20:13:13 crc kubenswrapper[4802]: I1201 20:13:13.921537 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-util\") pod \"d4928d2b-4f1e-4d3c-a858-8180343a7405\" (UID: \"d4928d2b-4f1e-4d3c-a858-8180343a7405\") " Dec 01 20:13:13 crc kubenswrapper[4802]: I1201 20:13:13.923061 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-bundle" (OuterVolumeSpecName: "bundle") pod "d4928d2b-4f1e-4d3c-a858-8180343a7405" (UID: "d4928d2b-4f1e-4d3c-a858-8180343a7405"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:13:13 crc kubenswrapper[4802]: I1201 20:13:13.930999 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4928d2b-4f1e-4d3c-a858-8180343a7405-kube-api-access-4lxzn" (OuterVolumeSpecName: "kube-api-access-4lxzn") pod "d4928d2b-4f1e-4d3c-a858-8180343a7405" (UID: "d4928d2b-4f1e-4d3c-a858-8180343a7405"). InnerVolumeSpecName "kube-api-access-4lxzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:13:13 crc kubenswrapper[4802]: I1201 20:13:13.945190 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-util" (OuterVolumeSpecName: "util") pod "d4928d2b-4f1e-4d3c-a858-8180343a7405" (UID: "d4928d2b-4f1e-4d3c-a858-8180343a7405"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:13:14 crc kubenswrapper[4802]: I1201 20:13:14.024257 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lxzn\" (UniqueName: \"kubernetes.io/projected/d4928d2b-4f1e-4d3c-a858-8180343a7405-kube-api-access-4lxzn\") on node \"crc\" DevicePath \"\"" Dec 01 20:13:14 crc kubenswrapper[4802]: I1201 20:13:14.024328 4802 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:13:14 crc kubenswrapper[4802]: I1201 20:13:14.024349 4802 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4928d2b-4f1e-4d3c-a858-8180343a7405-util\") on node \"crc\" DevicePath \"\"" Dec 01 20:13:14 crc kubenswrapper[4802]: I1201 20:13:14.404621 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" event={"ID":"d4928d2b-4f1e-4d3c-a858-8180343a7405","Type":"ContainerDied","Data":"7e17fe32f547b2988f6935af2b3b8262e570f3cdab6598b73cd2c89e75a7e319"} Dec 01 20:13:14 crc kubenswrapper[4802]: I1201 20:13:14.404688 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e17fe32f547b2988f6935af2b3b8262e570f3cdab6598b73cd2c89e75a7e319" Dec 01 20:13:14 crc kubenswrapper[4802]: I1201 20:13:14.404734 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.169950 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz"] Dec 01 20:13:21 crc kubenswrapper[4802]: E1201 20:13:21.173177 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerName="extract" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.173355 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerName="extract" Dec 01 20:13:21 crc kubenswrapper[4802]: E1201 20:13:21.173485 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerName="util" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.173586 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerName="util" Dec 01 20:13:21 crc kubenswrapper[4802]: E1201 20:13:21.173718 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerName="pull" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.173802 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerName="pull" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.174074 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4928d2b-4f1e-4d3c-a858-8180343a7405" containerName="extract" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.174970 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.180295 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-ltf8q" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.209803 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz"] Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.348461 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm47g\" (UniqueName: \"kubernetes.io/projected/b26cce34-e8fa-4145-a0dd-daa30dfdde81-kube-api-access-rm47g\") pod \"openstack-operator-controller-operator-849fbcc767-rv5gz\" (UID: \"b26cce34-e8fa-4145-a0dd-daa30dfdde81\") " pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.450068 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm47g\" (UniqueName: \"kubernetes.io/projected/b26cce34-e8fa-4145-a0dd-daa30dfdde81-kube-api-access-rm47g\") pod \"openstack-operator-controller-operator-849fbcc767-rv5gz\" (UID: \"b26cce34-e8fa-4145-a0dd-daa30dfdde81\") " pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.475539 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm47g\" (UniqueName: \"kubernetes.io/projected/b26cce34-e8fa-4145-a0dd-daa30dfdde81-kube-api-access-rm47g\") pod \"openstack-operator-controller-operator-849fbcc767-rv5gz\" (UID: \"b26cce34-e8fa-4145-a0dd-daa30dfdde81\") " pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz" Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.494266 4802 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz"
Dec 01 20:13:21 crc kubenswrapper[4802]: I1201 20:13:21.810452 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz"]
Dec 01 20:13:21 crc kubenswrapper[4802]: W1201 20:13:21.830580 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb26cce34_e8fa_4145_a0dd_daa30dfdde81.slice/crio-ce8c115ad992cbd0b5eb7d578f890742e3101bd0e61fde11030463619c47000c WatchSource:0}: Error finding container ce8c115ad992cbd0b5eb7d578f890742e3101bd0e61fde11030463619c47000c: Status 404 returned error can't find the container with id ce8c115ad992cbd0b5eb7d578f890742e3101bd0e61fde11030463619c47000c
Dec 01 20:13:22 crc kubenswrapper[4802]: I1201 20:13:22.476227 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz" event={"ID":"b26cce34-e8fa-4145-a0dd-daa30dfdde81","Type":"ContainerStarted","Data":"ce8c115ad992cbd0b5eb7d578f890742e3101bd0e61fde11030463619c47000c"}
Dec 01 20:13:27 crc kubenswrapper[4802]: I1201 20:13:27.523739 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz" event={"ID":"b26cce34-e8fa-4145-a0dd-daa30dfdde81","Type":"ContainerStarted","Data":"b2687acac4567f934d56faf9a51e56155feccfbb9bcfc897dc4c14acc6f6606e"}
Dec 01 20:13:27 crc kubenswrapper[4802]: I1201 20:13:27.524744 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz"
Dec 01 20:13:27 crc kubenswrapper[4802]: I1201 20:13:27.570510 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz" podStartSLOduration=1.4341735340000001 podStartE2EDuration="6.570481675s" podCreationTimestamp="2025-12-01 20:13:21 +0000 UTC" firstStartedPulling="2025-12-01 20:13:21.832602596 +0000 UTC m=+1023.395162237" lastFinishedPulling="2025-12-01 20:13:26.968910737 +0000 UTC m=+1028.531470378" observedRunningTime="2025-12-01 20:13:27.56427391 +0000 UTC m=+1029.126833581" watchObservedRunningTime="2025-12-01 20:13:27.570481675 +0000 UTC m=+1029.133041336"
Dec 01 20:13:41 crc kubenswrapper[4802]: I1201 20:13:41.499568 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-849fbcc767-rv5gz"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.260560 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.262745 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.266246 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-t9d5s"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.272929 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.277087 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.286491 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-wvz8t"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.338690 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.340696 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.374372 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.380544 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.382231 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gqgwc"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.423289 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.443643 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcdnt\" (UniqueName: \"kubernetes.io/projected/b01ea1d5-0409-4c32-bb34-1b88253ceb05-kube-api-access-tcdnt\") pod \"designate-operator-controller-manager-78b4bc895b-qbvvz\" (UID: \"b01ea1d5-0409-4c32-bb34-1b88253ceb05\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.443892 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dswq\" (UniqueName: \"kubernetes.io/projected/e067c10a-e5d4-4e57-bf14-3b0bfc8ac069-kube-api-access-5dswq\") pod \"cinder-operator-controller-manager-859b6ccc6-n7mjb\" (UID: \"e067c10a-e5d4-4e57-bf14-3b0bfc8ac069\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.444077 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htdws\" (UniqueName: \"kubernetes.io/projected/18526d53-2d4c-4c40-885c-c83b3b378260-kube-api-access-htdws\") pod \"barbican-operator-controller-manager-7d9dfd778-6d8qg\" (UID: \"18526d53-2d4c-4c40-885c-c83b3b378260\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.446644 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.448275 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.452310 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.452796 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hsjj9"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.454076 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.462640 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.463564 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-t5wm4"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.477347 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.478916 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.481459 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wpb2h"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.485183 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.502824 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.504340 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.509796 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7cwbk"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.510161 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.512296 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.521523 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.540679 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.542338 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.545271 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htdws\" (UniqueName: \"kubernetes.io/projected/18526d53-2d4c-4c40-885c-c83b3b378260-kube-api-access-htdws\") pod \"barbican-operator-controller-manager-7d9dfd778-6d8qg\" (UID: \"18526d53-2d4c-4c40-885c-c83b3b378260\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.545320 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7ckt\" (UniqueName: \"kubernetes.io/projected/8626cbeb-8604-4371-b936-99cab8d76742-kube-api-access-s7ckt\") pod \"heat-operator-controller-manager-5f64f6f8bb-5ftkv\" (UID: \"8626cbeb-8604-4371-b936-99cab8d76742\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.545372 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcdnt\" (UniqueName: \"kubernetes.io/projected/b01ea1d5-0409-4c32-bb34-1b88253ceb05-kube-api-access-tcdnt\") pod \"designate-operator-controller-manager-78b4bc895b-qbvvz\" (UID: \"b01ea1d5-0409-4c32-bb34-1b88253ceb05\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.545407 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fnk\" (UniqueName: \"kubernetes.io/projected/eb553ce8-f696-4c6b-a745-aa1faa5f9356-kube-api-access-45fnk\") pod \"glance-operator-controller-manager-776c976b46-x7bkb\" (UID: \"eb553ce8-f696-4c6b-a745-aa1faa5f9356\") " pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.545438 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dswq\" (UniqueName: \"kubernetes.io/projected/e067c10a-e5d4-4e57-bf14-3b0bfc8ac069-kube-api-access-5dswq\") pod \"cinder-operator-controller-manager-859b6ccc6-n7mjb\" (UID: \"e067c10a-e5d4-4e57-bf14-3b0bfc8ac069\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.548360 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.559115 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.559341 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.562617 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-54jl5"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.567786 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qxn7c"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.580981 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.595133 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.596741 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.603430 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htdws\" (UniqueName: \"kubernetes.io/projected/18526d53-2d4c-4c40-885c-c83b3b378260-kube-api-access-htdws\") pod \"barbican-operator-controller-manager-7d9dfd778-6d8qg\" (UID: \"18526d53-2d4c-4c40-885c-c83b3b378260\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.605709 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hntm5"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.608266 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcdnt\" (UniqueName: \"kubernetes.io/projected/b01ea1d5-0409-4c32-bb34-1b88253ceb05-kube-api-access-tcdnt\") pod \"designate-operator-controller-manager-78b4bc895b-qbvvz\" (UID: \"b01ea1d5-0409-4c32-bb34-1b88253ceb05\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.621163 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dswq\" (UniqueName: \"kubernetes.io/projected/e067c10a-e5d4-4e57-bf14-3b0bfc8ac069-kube-api-access-5dswq\") pod \"cinder-operator-controller-manager-859b6ccc6-n7mjb\" (UID: \"e067c10a-e5d4-4e57-bf14-3b0bfc8ac069\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.646665 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xg49\" (UniqueName: \"kubernetes.io/projected/e5c436a3-2237-4f02-a9fc-b2aae90ce3b1-kube-api-access-4xg49\") pod \"horizon-operator-controller-manager-68c6d99b8f-kmtdj\" (UID: \"e5c436a3-2237-4f02-a9fc-b2aae90ce3b1\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.646732 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7ckt\" (UniqueName: \"kubernetes.io/projected/8626cbeb-8604-4371-b936-99cab8d76742-kube-api-access-s7ckt\") pod \"heat-operator-controller-manager-5f64f6f8bb-5ftkv\" (UID: \"8626cbeb-8604-4371-b936-99cab8d76742\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.646787 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92lc\" (UniqueName: \"kubernetes.io/projected/0e1b6ed3-9b66-4279-9a9f-0685037df9c3-kube-api-access-f92lc\") pod \"keystone-operator-controller-manager-546d4bdf48-2nm2t\" (UID: \"0e1b6ed3-9b66-4279-9a9f-0685037df9c3\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.646821 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkc5g\" (UniqueName: \"kubernetes.io/projected/b40759e9-9a00-445c-964e-09f1d539d85e-kube-api-access-hkc5g\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.646849 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pl9r\" (UniqueName: \"kubernetes.io/projected/74aa06c0-a03f-4719-b751-a77ab3d472f2-kube-api-access-4pl9r\") pod \"ironic-operator-controller-manager-6c548fd776-jcmlp\" (UID: \"74aa06c0-a03f-4719-b751-a77ab3d472f2\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.646898 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45fnk\" (UniqueName: \"kubernetes.io/projected/eb553ce8-f696-4c6b-a745-aa1faa5f9356-kube-api-access-45fnk\") pod \"glance-operator-controller-manager-776c976b46-x7bkb\" (UID: \"eb553ce8-f696-4c6b-a745-aa1faa5f9356\") " pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.646935 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.655951 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.697527 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.707149 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7ckt\" (UniqueName: \"kubernetes.io/projected/8626cbeb-8604-4371-b936-99cab8d76742-kube-api-access-s7ckt\") pod \"heat-operator-controller-manager-5f64f6f8bb-5ftkv\" (UID: \"8626cbeb-8604-4371-b936-99cab8d76742\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.708091 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fnk\" (UniqueName: \"kubernetes.io/projected/eb553ce8-f696-4c6b-a745-aa1faa5f9356-kube-api-access-45fnk\") pod \"glance-operator-controller-manager-776c976b46-x7bkb\" (UID: \"eb553ce8-f696-4c6b-a745-aa1faa5f9356\") " pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.725038 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.748316 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xg49\" (UniqueName: \"kubernetes.io/projected/e5c436a3-2237-4f02-a9fc-b2aae90ce3b1-kube-api-access-4xg49\") pod \"horizon-operator-controller-manager-68c6d99b8f-kmtdj\" (UID: \"e5c436a3-2237-4f02-a9fc-b2aae90ce3b1\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.748408 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fvc\" (UniqueName: \"kubernetes.io/projected/1891b769-8e7e-4375-b3ea-421a23fb7af4-kube-api-access-p6fvc\") pod \"manila-operator-controller-manager-6546668bfd-7stkq\" (UID: \"1891b769-8e7e-4375-b3ea-421a23fb7af4\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.748436 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f92lc\" (UniqueName: \"kubernetes.io/projected/0e1b6ed3-9b66-4279-9a9f-0685037df9c3-kube-api-access-f92lc\") pod \"keystone-operator-controller-manager-546d4bdf48-2nm2t\" (UID: \"0e1b6ed3-9b66-4279-9a9f-0685037df9c3\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.748465 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkc5g\" (UniqueName: \"kubernetes.io/projected/b40759e9-9a00-445c-964e-09f1d539d85e-kube-api-access-hkc5g\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.748501 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pl9r\" (UniqueName: \"kubernetes.io/projected/74aa06c0-a03f-4719-b751-a77ab3d472f2-kube-api-access-4pl9r\") pod \"ironic-operator-controller-manager-6c548fd776-jcmlp\" (UID: \"74aa06c0-a03f-4719-b751-a77ab3d472f2\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.748570 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7"
Dec 01 20:14:10 crc kubenswrapper[4802]: E1201 20:14:10.748805 4802 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 01 20:14:10 crc kubenswrapper[4802]: E1201 20:14:10.748900 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert podName:b40759e9-9a00-445c-964e-09f1d539d85e nodeName:}" failed. No retries permitted until 2025-12-01 20:14:11.248871807 +0000 UTC m=+1072.811431448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert") pod "infra-operator-controller-manager-57548d458d-jbfz7" (UID: "b40759e9-9a00-445c-964e-09f1d539d85e") : secret "infra-operator-webhook-server-cert" not found
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.750097 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.767576 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.769008 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.770626 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.775979 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.779867 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-gvn47"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.780611 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.786304 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jtfns"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.800060 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92lc\" (UniqueName: \"kubernetes.io/projected/0e1b6ed3-9b66-4279-9a9f-0685037df9c3-kube-api-access-f92lc\") pod \"keystone-operator-controller-manager-546d4bdf48-2nm2t\" (UID: \"0e1b6ed3-9b66-4279-9a9f-0685037df9c3\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.800233 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.803126 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.806968 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xg49\" (UniqueName: \"kubernetes.io/projected/e5c436a3-2237-4f02-a9fc-b2aae90ce3b1-kube-api-access-4xg49\") pod \"horizon-operator-controller-manager-68c6d99b8f-kmtdj\" (UID: \"e5c436a3-2237-4f02-a9fc-b2aae90ce3b1\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.807099 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pl9r\" (UniqueName: \"kubernetes.io/projected/74aa06c0-a03f-4719-b751-a77ab3d472f2-kube-api-access-4pl9r\") pod \"ironic-operator-controller-manager-6c548fd776-jcmlp\" (UID: \"74aa06c0-a03f-4719-b751-a77ab3d472f2\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.816877 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.818310 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkc5g\" (UniqueName: \"kubernetes.io/projected/b40759e9-9a00-445c-964e-09f1d539d85e-kube-api-access-hkc5g\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.839358 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.848472 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.859267 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-vbzmc"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.863605 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cxd4\" (UniqueName: \"kubernetes.io/projected/e58c799d-fcaa-4d9b-aa6c-c8947774bd2e-kube-api-access-7cxd4\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-cswfs\" (UID: \"e58c799d-fcaa-4d9b-aa6c-c8947774bd2e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.863831 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d22fw\" (UniqueName: \"kubernetes.io/projected/6973effc-3f05-43cd-ba03-b9efe3b6db1d-kube-api-access-d22fw\") pod \"mariadb-operator-controller-manager-56bbcc9d85-p475v\" (UID: \"6973effc-3f05-43cd-ba03-b9efe3b6db1d\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.863883 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fvc\" (UniqueName: \"kubernetes.io/projected/1891b769-8e7e-4375-b3ea-421a23fb7af4-kube-api-access-p6fvc\") pod \"manila-operator-controller-manager-6546668bfd-7stkq\" (UID: \"1891b769-8e7e-4375-b3ea-421a23fb7af4\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.871800 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-vb97q"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.873269 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.877085 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tb56n"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.895154 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.917794 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fvc\" (UniqueName: \"kubernetes.io/projected/1891b769-8e7e-4375-b3ea-421a23fb7af4-kube-api-access-p6fvc\") pod \"manila-operator-controller-manager-6546668bfd-7stkq\" (UID: \"1891b769-8e7e-4375-b3ea-421a23fb7af4\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.926709 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.968923 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99z68\" (UniqueName: \"kubernetes.io/projected/8e5dddc5-34ff-4a71-a626-3c9cea7ef30f-kube-api-access-99z68\") pod \"octavia-operator-controller-manager-998648c74-vb97q\" (UID: \"8e5dddc5-34ff-4a71-a626-3c9cea7ef30f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.969625 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxd4\" (UniqueName: \"kubernetes.io/projected/e58c799d-fcaa-4d9b-aa6c-c8947774bd2e-kube-api-access-7cxd4\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-cswfs\" (UID: \"e58c799d-fcaa-4d9b-aa6c-c8947774bd2e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.970261 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.970656 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d22fw\" (UniqueName: \"kubernetes.io/projected/6973effc-3f05-43cd-ba03-b9efe3b6db1d-kube-api-access-d22fw\") pod \"mariadb-operator-controller-manager-56bbcc9d85-p475v\" (UID: \"6973effc-3f05-43cd-ba03-b9efe3b6db1d\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.970889 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw7x8\" (UniqueName: \"kubernetes.io/projected/c7839b31-af95-4d33-a954-9615ea0c87a6-kube-api-access-pw7x8\") pod \"nova-operator-controller-manager-697bc559fc-djvhl\" (UID: \"c7839b31-af95-4d33-a954-9615ea0c87a6\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl"
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.971243 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl"]
Dec 01 20:14:10 crc kubenswrapper[4802]: I1201 20:14:10.983472 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv"]
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.019118 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.027846 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.030114 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxd4\" (UniqueName: \"kubernetes.io/projected/e58c799d-fcaa-4d9b-aa6c-c8947774bd2e-kube-api-access-7cxd4\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-cswfs\" (UID: \"e58c799d-fcaa-4d9b-aa6c-c8947774bd2e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.029326 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-z29b9"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.035248 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d22fw\" (UniqueName: \"kubernetes.io/projected/6973effc-3f05-43cd-ba03-b9efe3b6db1d-kube-api-access-d22fw\") pod \"mariadb-operator-controller-manager-56bbcc9d85-p475v\" (UID: \"6973effc-3f05-43cd-ba03-b9efe3b6db1d\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.040000 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-vb97q"]
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.067070 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62"]
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.075499 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99z68\" (UniqueName: \"kubernetes.io/projected/8e5dddc5-34ff-4a71-a626-3c9cea7ef30f-kube-api-access-99z68\") pod \"octavia-operator-controller-manager-998648c74-vb97q\" (UID: \"8e5dddc5-34ff-4a71-a626-3c9cea7ef30f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.075664 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw7x8\" (UniqueName: \"kubernetes.io/projected/c7839b31-af95-4d33-a954-9615ea0c87a6-kube-api-access-pw7x8\") pod \"nova-operator-controller-manager-697bc559fc-djvhl\" (UID: \"c7839b31-af95-4d33-a954-9615ea0c87a6\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.075712 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-888pg\" (UniqueName: \"kubernetes.io/projected/d12b9eb3-946b-4578-8630-4cb6643ab36f-kube-api-access-888pg\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.075743 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.091958 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.098616 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw7x8\" (UniqueName: \"kubernetes.io/projected/c7839b31-af95-4d33-a954-9615ea0c87a6-kube-api-access-pw7x8\") pod \"nova-operator-controller-manager-697bc559fc-djvhl\" (UID: \"c7839b31-af95-4d33-a954-9615ea0c87a6\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.098818 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62"]
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.098858 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv"]
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.098873 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-45m8m"]
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.099777 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m"
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.100267 4802 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.115632 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nc7wz" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.115708 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qp6wf" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.116786 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.127209 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99z68\" (UniqueName: \"kubernetes.io/projected/8e5dddc5-34ff-4a71-a626-3c9cea7ef30f-kube-api-access-99z68\") pod \"octavia-operator-controller-manager-998648c74-vb97q\" (UID: \"8e5dddc5-34ff-4a71-a626-3c9cea7ef30f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.127319 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-45m8m"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.128145 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.153747 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.176786 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qccf8\" (UniqueName: \"kubernetes.io/projected/0934b0fd-8a48-4dee-b668-08c7b631551f-kube-api-access-qccf8\") pod \"ovn-operator-controller-manager-b6456fdb6-lfh62\" (UID: \"0934b0fd-8a48-4dee-b668-08c7b631551f\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.176829 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cddv7\" (UniqueName: \"kubernetes.io/projected/088be214-85a6-4cb1-9e02-fcde44abb492-kube-api-access-cddv7\") pod \"placement-operator-controller-manager-78f8948974-45m8m\" (UID: \"088be214-85a6-4cb1-9e02-fcde44abb492\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.176864 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-888pg\" (UniqueName: \"kubernetes.io/projected/d12b9eb3-946b-4578-8630-4cb6643ab36f-kube-api-access-888pg\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.176882 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert\") pod 
\"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.177035 4802 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.177098 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert podName:d12b9eb3-946b-4578-8630-4cb6643ab36f nodeName:}" failed. No retries permitted until 2025-12-01 20:14:11.677079836 +0000 UTC m=+1073.239639477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" (UID: "d12b9eb3-946b-4578-8630-4cb6643ab36f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.178543 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.178776 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.181790 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.182279 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-cr6lg" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.182507 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.192666 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-qcswm" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.204856 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-888pg\" (UniqueName: \"kubernetes.io/projected/d12b9eb3-946b-4578-8630-4cb6643ab36f-kube-api-access-888pg\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.207687 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.214233 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.217308 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.221157 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-jpzt5" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.236321 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.251509 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.261067 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.262980 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.269549 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-7sxb2" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.279100 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f4jd\" (UniqueName: \"kubernetes.io/projected/92132c51-643c-4442-adf2-897bd2825fdf-kube-api-access-4f4jd\") pod \"test-operator-controller-manager-5854674fcc-wx6ct\" (UID: \"92132c51-643c-4442-adf2-897bd2825fdf\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.279156 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qccf8\" (UniqueName: \"kubernetes.io/projected/0934b0fd-8a48-4dee-b668-08c7b631551f-kube-api-access-qccf8\") pod \"ovn-operator-controller-manager-b6456fdb6-lfh62\" (UID: \"0934b0fd-8a48-4dee-b668-08c7b631551f\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.279177 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cddv7\" (UniqueName: \"kubernetes.io/projected/088be214-85a6-4cb1-9e02-fcde44abb492-kube-api-access-cddv7\") pod \"placement-operator-controller-manager-78f8948974-45m8m\" (UID: \"088be214-85a6-4cb1-9e02-fcde44abb492\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.279247 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: 
\"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.279278 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcnwx\" (UniqueName: \"kubernetes.io/projected/bb89e7bb-899f-4f3e-80cd-833fbc74db85-kube-api-access-jcnwx\") pod \"swift-operator-controller-manager-5f8c65bbfc-trddt\" (UID: \"bb89e7bb-899f-4f3e-80cd-833fbc74db85\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.279320 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-972l4\" (UniqueName: \"kubernetes.io/projected/369e7da7-22d9-470f-9ad0-48472ceffde4-kube-api-access-972l4\") pod \"telemetry-operator-controller-manager-76cc84c6bb-8b6ht\" (UID: \"369e7da7-22d9-470f-9ad0-48472ceffde4\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.279615 4802 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.279655 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert podName:b40759e9-9a00-445c-964e-09f1d539d85e nodeName:}" failed. No retries permitted until 2025-12-01 20:14:12.27963922 +0000 UTC m=+1073.842198861 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert") pod "infra-operator-controller-manager-57548d458d-jbfz7" (UID: "b40759e9-9a00-445c-964e-09f1d539d85e") : secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.289822 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.301133 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.320659 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qccf8\" (UniqueName: \"kubernetes.io/projected/0934b0fd-8a48-4dee-b668-08c7b631551f-kube-api-access-qccf8\") pod \"ovn-operator-controller-manager-b6456fdb6-lfh62\" (UID: \"0934b0fd-8a48-4dee-b668-08c7b631551f\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.327540 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.330131 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.334885 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.335325 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8h28j" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.335569 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.338267 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.344279 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cddv7\" (UniqueName: \"kubernetes.io/projected/088be214-85a6-4cb1-9e02-fcde44abb492-kube-api-access-cddv7\") pod \"placement-operator-controller-manager-78f8948974-45m8m\" (UID: \"088be214-85a6-4cb1-9e02-fcde44abb492\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.381155 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.381235 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbvh\" (UniqueName: 
\"kubernetes.io/projected/97d3762b-15ce-45aa-9767-5be47c85e039-kube-api-access-4dbvh\") pod \"watcher-operator-controller-manager-769dc69bc-r89vq\" (UID: \"97d3762b-15ce-45aa-9767-5be47c85e039\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.381291 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.381501 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsvc2\" (UniqueName: \"kubernetes.io/projected/aebbca29-71df-4bef-8108-66b226259a58-kube-api-access-nsvc2\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.381534 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcnwx\" (UniqueName: \"kubernetes.io/projected/bb89e7bb-899f-4f3e-80cd-833fbc74db85-kube-api-access-jcnwx\") pod \"swift-operator-controller-manager-5f8c65bbfc-trddt\" (UID: \"bb89e7bb-899f-4f3e-80cd-833fbc74db85\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.381587 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-972l4\" (UniqueName: \"kubernetes.io/projected/369e7da7-22d9-470f-9ad0-48472ceffde4-kube-api-access-972l4\") pod 
\"telemetry-operator-controller-manager-76cc84c6bb-8b6ht\" (UID: \"369e7da7-22d9-470f-9ad0-48472ceffde4\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.381619 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f4jd\" (UniqueName: \"kubernetes.io/projected/92132c51-643c-4442-adf2-897bd2825fdf-kube-api-access-4f4jd\") pod \"test-operator-controller-manager-5854674fcc-wx6ct\" (UID: \"92132c51-643c-4442-adf2-897bd2825fdf\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.411994 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f4jd\" (UniqueName: \"kubernetes.io/projected/92132c51-643c-4442-adf2-897bd2825fdf-kube-api-access-4f4jd\") pod \"test-operator-controller-manager-5854674fcc-wx6ct\" (UID: \"92132c51-643c-4442-adf2-897bd2825fdf\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.420492 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcnwx\" (UniqueName: \"kubernetes.io/projected/bb89e7bb-899f-4f3e-80cd-833fbc74db85-kube-api-access-jcnwx\") pod \"swift-operator-controller-manager-5f8c65bbfc-trddt\" (UID: \"bb89e7bb-899f-4f3e-80cd-833fbc74db85\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.430542 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-972l4\" (UniqueName: \"kubernetes.io/projected/369e7da7-22d9-470f-9ad0-48472ceffde4-kube-api-access-972l4\") pod \"telemetry-operator-controller-manager-76cc84c6bb-8b6ht\" (UID: \"369e7da7-22d9-470f-9ad0-48472ceffde4\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" 
Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.470328 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.470844 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.471845 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.485987 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6d56j" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.487212 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsvc2\" (UniqueName: \"kubernetes.io/projected/aebbca29-71df-4bef-8108-66b226259a58-kube-api-access-nsvc2\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.487315 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.487340 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbvh\" (UniqueName: 
\"kubernetes.io/projected/97d3762b-15ce-45aa-9767-5be47c85e039-kube-api-access-4dbvh\") pod \"watcher-operator-controller-manager-769dc69bc-r89vq\" (UID: \"97d3762b-15ce-45aa-9767-5be47c85e039\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.487362 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.487481 4802 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.487529 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. No retries permitted until 2025-12-01 20:14:11.987511445 +0000 UTC m=+1073.550071086 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "webhook-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.490857 4802 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.491017 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. No retries permitted until 2025-12-01 20:14:11.990983994 +0000 UTC m=+1073.553543635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "metrics-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.498295 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.517873 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbvh\" (UniqueName: \"kubernetes.io/projected/97d3762b-15ce-45aa-9767-5be47c85e039-kube-api-access-4dbvh\") pod \"watcher-operator-controller-manager-769dc69bc-r89vq\" (UID: \"97d3762b-15ce-45aa-9767-5be47c85e039\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.523092 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.527723 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsvc2\" (UniqueName: \"kubernetes.io/projected/aebbca29-71df-4bef-8108-66b226259a58-kube-api-access-nsvc2\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.581045 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.584220 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.596132 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz47n\" (UniqueName: \"kubernetes.io/projected/a3350b6c-2091-4a61-a78e-5a1bcdfd11cf-kube-api-access-fz47n\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mnwdn\" (UID: \"a3350b6c-2091-4a61-a78e-5a1bcdfd11cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.612021 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.617892 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.691907 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.699396 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz47n\" (UniqueName: \"kubernetes.io/projected/a3350b6c-2091-4a61-a78e-5a1bcdfd11cf-kube-api-access-fz47n\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mnwdn\" (UID: \"a3350b6c-2091-4a61-a78e-5a1bcdfd11cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.699570 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.700051 4802 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: E1201 20:14:11.722389 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert podName:d12b9eb3-946b-4578-8630-4cb6643ab36f nodeName:}" failed. No retries permitted until 2025-12-01 20:14:12.700111147 +0000 UTC m=+1074.262670788 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" (UID: "d12b9eb3-946b-4578-8630-4cb6643ab36f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.764632 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz47n\" (UniqueName: \"kubernetes.io/projected/a3350b6c-2091-4a61-a78e-5a1bcdfd11cf-kube-api-access-fz47n\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mnwdn\" (UID: \"a3350b6c-2091-4a61-a78e-5a1bcdfd11cf\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" Dec 01 20:14:11 crc kubenswrapper[4802]: W1201 20:14:11.775752 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18526d53_2d4c_4c40_885c_c83b3b378260.slice/crio-7832604dd092116584a97db6712507afeae80f83e5dbd617da8093d79c976989 WatchSource:0}: Error finding container 7832604dd092116584a97db6712507afeae80f83e5dbd617da8093d79c976989: Status 404 returned error can't find the container with id 7832604dd092116584a97db6712507afeae80f83e5dbd617da8093d79c976989 Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.821093 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.921128 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz"] Dec 01 20:14:11 crc kubenswrapper[4802]: I1201 20:14:11.986912 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg" event={"ID":"18526d53-2d4c-4c40-885c-c83b3b378260","Type":"ContainerStarted","Data":"7832604dd092116584a97db6712507afeae80f83e5dbd617da8093d79c976989"} Dec 01 20:14:12 crc kubenswrapper[4802]: I1201 20:14:12.014340 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:12 crc kubenswrapper[4802]: I1201 20:14:12.014717 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:12 crc kubenswrapper[4802]: E1201 20:14:12.014966 4802 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 20:14:12 crc kubenswrapper[4802]: E1201 20:14:12.015120 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. 
No retries permitted until 2025-12-01 20:14:13.015090268 +0000 UTC m=+1074.577649909 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "webhook-server-cert" not found Dec 01 20:14:12 crc kubenswrapper[4802]: E1201 20:14:12.015760 4802 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 20:14:12 crc kubenswrapper[4802]: E1201 20:14:12.015863 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. No retries permitted until 2025-12-01 20:14:13.015850612 +0000 UTC m=+1074.578410253 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "metrics-server-cert" not found Dec 01 20:14:12 crc kubenswrapper[4802]: I1201 20:14:12.119244 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb"] Dec 01 20:14:12 crc kubenswrapper[4802]: W1201 20:14:12.124408 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb553ce8_f696_4c6b_a745_aa1faa5f9356.slice/crio-efb283544c8376af6593065d96ce2dbaaa9e1681dd99a431ca2300c4e11646c1 WatchSource:0}: Error finding container efb283544c8376af6593065d96ce2dbaaa9e1681dd99a431ca2300c4e11646c1: Status 404 returned error can't find the container with id efb283544c8376af6593065d96ce2dbaaa9e1681dd99a431ca2300c4e11646c1 Dec 
01 20:14:12 crc kubenswrapper[4802]: W1201 20:14:12.125281 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb01ea1d5_0409_4c32_bb34_1b88253ceb05.slice/crio-051a9adb35afb9cac1a5d30199440d209c118f300c84bcd6cfaf3949517bf783 WatchSource:0}: Error finding container 051a9adb35afb9cac1a5d30199440d209c118f300c84bcd6cfaf3949517bf783: Status 404 returned error can't find the container with id 051a9adb35afb9cac1a5d30199440d209c118f300c84bcd6cfaf3949517bf783 Dec 01 20:14:12 crc kubenswrapper[4802]: I1201 20:14:12.323581 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:14:12 crc kubenswrapper[4802]: E1201 20:14:12.325980 4802 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:12 crc kubenswrapper[4802]: E1201 20:14:12.326100 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert podName:b40759e9-9a00-445c-964e-09f1d539d85e nodeName:}" failed. No retries permitted until 2025-12-01 20:14:14.326064934 +0000 UTC m=+1075.888624745 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert") pod "infra-operator-controller-manager-57548d458d-jbfz7" (UID: "b40759e9-9a00-445c-964e-09f1d539d85e") : secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:12 crc kubenswrapper[4802]: I1201 20:14:12.596524 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb"] Dec 01 20:14:12 crc kubenswrapper[4802]: I1201 20:14:12.607076 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj"] Dec 01 20:14:12 crc kubenswrapper[4802]: I1201 20:14:12.628033 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp"] Dec 01 20:14:12 crc kubenswrapper[4802]: W1201 20:14:12.651823 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74aa06c0_a03f_4719_b751_a77ab3d472f2.slice/crio-b2f633dcaf774b26898c7dd5199eebcf0c14bb0f8e6638db315b2d0c1b62daa3 WatchSource:0}: Error finding container b2f633dcaf774b26898c7dd5199eebcf0c14bb0f8e6638db315b2d0c1b62daa3: Status 404 returned error can't find the container with id b2f633dcaf774b26898c7dd5199eebcf0c14bb0f8e6638db315b2d0c1b62daa3 Dec 01 20:14:12 crc kubenswrapper[4802]: I1201 20:14:12.731703 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:12 crc kubenswrapper[4802]: E1201 20:14:12.732624 4802 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:12 crc kubenswrapper[4802]: E1201 20:14:12.732770 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert podName:d12b9eb3-946b-4578-8630-4cb6643ab36f nodeName:}" failed. No retries permitted until 2025-12-01 20:14:14.732726628 +0000 UTC m=+1076.295286269 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" (UID: "d12b9eb3-946b-4578-8630-4cb6643ab36f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:12 crc kubenswrapper[4802]: I1201 20:14:12.754486 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv"] Dec 01 20:14:12 crc kubenswrapper[4802]: W1201 20:14:12.776970 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8626cbeb_8604_4371_b936_99cab8d76742.slice/crio-b3f7cc11db1591585af194c90484fc8ffe49bdba6c02c14b518215316d4c5216 WatchSource:0}: Error finding container b3f7cc11db1591585af194c90484fc8ffe49bdba6c02c14b518215316d4c5216: Status 404 returned error can't find the container with id b3f7cc11db1591585af194c90484fc8ffe49bdba6c02c14b518215316d4c5216 Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:12.999417 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb" event={"ID":"e067c10a-e5d4-4e57-bf14-3b0bfc8ac069","Type":"ContainerStarted","Data":"1f60d9acd7962fb6248f1c697d4678b9745677335dde40928cd4012438e6929d"} Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.001417 4802 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz" event={"ID":"b01ea1d5-0409-4c32-bb34-1b88253ceb05","Type":"ContainerStarted","Data":"051a9adb35afb9cac1a5d30199440d209c118f300c84bcd6cfaf3949517bf783"} Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.010564 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv" event={"ID":"8626cbeb-8604-4371-b936-99cab8d76742","Type":"ContainerStarted","Data":"b3f7cc11db1591585af194c90484fc8ffe49bdba6c02c14b518215316d4c5216"} Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.011855 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t"] Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.012681 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb" event={"ID":"eb553ce8-f696-4c6b-a745-aa1faa5f9356","Type":"ContainerStarted","Data":"efb283544c8376af6593065d96ce2dbaaa9e1681dd99a431ca2300c4e11646c1"} Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.015816 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp" event={"ID":"74aa06c0-a03f-4719-b751-a77ab3d472f2","Type":"ContainerStarted","Data":"b2f633dcaf774b26898c7dd5199eebcf0c14bb0f8e6638db315b2d0c1b62daa3"} Dec 01 20:14:13 crc kubenswrapper[4802]: W1201 20:14:13.030401 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1b6ed3_9b66_4279_9a9f_0685037df9c3.slice/crio-8614eae2adae75525cb1d9a626bf5538a09744216d46d131593b5457165a0e39 WatchSource:0}: Error finding container 8614eae2adae75525cb1d9a626bf5538a09744216d46d131593b5457165a0e39: Status 404 returned error can't find the container with id 
8614eae2adae75525cb1d9a626bf5538a09744216d46d131593b5457165a0e39 Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.032300 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj" event={"ID":"e5c436a3-2237-4f02-a9fc-b2aae90ce3b1","Type":"ContainerStarted","Data":"3607351118b1093dd26b5a7be7ca0f702a77337154648f93a1a2f9e8175c0ae9"} Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.047088 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.047138 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.047420 4802 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.047480 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. No retries permitted until 2025-12-01 20:14:15.047462932 +0000 UTC m=+1076.610022573 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "webhook-server-cert" not found Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.047594 4802 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.047703 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. No retries permitted until 2025-12-01 20:14:15.047673229 +0000 UTC m=+1076.610232860 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "metrics-server-cert" not found Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.077322 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq"] Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.088965 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-45m8m"] Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.112710 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs"] Dec 01 20:14:13 crc kubenswrapper[4802]: W1201 20:14:13.116502 4802 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod088be214_85a6_4cb1_9e02_fcde44abb492.slice/crio-17fc5d8c216c5172d9ebd83093402921cb156c72a34374d1035619b2bd4ed3fb WatchSource:0}: Error finding container 17fc5d8c216c5172d9ebd83093402921cb156c72a34374d1035619b2bd4ed3fb: Status 404 returned error can't find the container with id 17fc5d8c216c5172d9ebd83093402921cb156c72a34374d1035619b2bd4ed3fb Dec 01 20:14:13 crc kubenswrapper[4802]: W1201 20:14:13.138055 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e5dddc5_34ff_4a71_a626_3c9cea7ef30f.slice/crio-94ce741532bee3d5ddef62c140c4ed05d8a4ac14e439d82e9897ee405e9a644a WatchSource:0}: Error finding container 94ce741532bee3d5ddef62c140c4ed05d8a4ac14e439d82e9897ee405e9a644a: Status 404 returned error can't find the container with id 94ce741532bee3d5ddef62c140c4ed05d8a4ac14e439d82e9897ee405e9a644a Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.140769 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-vb97q"] Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.148000 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v"] Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.158262 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62"] Dec 01 20:14:13 crc kubenswrapper[4802]: W1201 20:14:13.163675 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7839b31_af95_4d33_a954_9615ea0c87a6.slice/crio-67a440178424ac77c07fb3e6f2b20581a83476571179010c93cd414a0f2d7148 WatchSource:0}: Error finding container 67a440178424ac77c07fb3e6f2b20581a83476571179010c93cd414a0f2d7148: Status 404 returned error can't find the 
container with id 67a440178424ac77c07fb3e6f2b20581a83476571179010c93cd414a0f2d7148 Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.168514 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq"] Dec 01 20:14:13 crc kubenswrapper[4802]: W1201 20:14:13.169534 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d3762b_15ce_45aa_9767_5be47c85e039.slice/crio-fb5af7a86e1fb67f919ac54a980c2426451e9ff36db6eae32d82e49d39cc83af WatchSource:0}: Error finding container fb5af7a86e1fb67f919ac54a980c2426451e9ff36db6eae32d82e49d39cc83af: Status 404 returned error can't find the container with id fb5af7a86e1fb67f919ac54a980c2426451e9ff36db6eae32d82e49d39cc83af Dec 01 20:14:13 crc kubenswrapper[4802]: W1201 20:14:13.178014 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0934b0fd_8a48_4dee_b668_08c7b631551f.slice/crio-d978992e4f93155385f41a0e5386af543339561dabe8740f099587f6fc5fbd62 WatchSource:0}: Error finding container d978992e4f93155385f41a0e5386af543339561dabe8740f099587f6fc5fbd62: Status 404 returned error can't find the container with id d978992e4f93155385f41a0e5386af543339561dabe8740f099587f6fc5fbd62 Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.183594 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl"] Dec 01 20:14:13 crc kubenswrapper[4802]: W1201 20:14:13.190179 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb89e7bb_899f_4f3e_80cd_833fbc74db85.slice/crio-9d3e53d962353a71883ad976a14b5b14f51a9cbfa27c4b863545eb75891390d8 WatchSource:0}: Error finding container 9d3e53d962353a71883ad976a14b5b14f51a9cbfa27c4b863545eb75891390d8: Status 404 returned error can't 
find the container with id 9d3e53d962353a71883ad976a14b5b14f51a9cbfa27c4b863545eb75891390d8 Dec 01 20:14:13 crc kubenswrapper[4802]: W1201 20:14:13.194985 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod369e7da7_22d9_470f_9ad0_48472ceffde4.slice/crio-61da6c0dcb7b3cad7c6c08adde43059848272c0c0f6e8354d082cd3fa3d5cdaf WatchSource:0}: Error finding container 61da6c0dcb7b3cad7c6c08adde43059848272c0c0f6e8354d082cd3fa3d5cdaf: Status 404 returned error can't find the container with id 61da6c0dcb7b3cad7c6c08adde43059848272c0c0f6e8354d082cd3fa3d5cdaf Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.197638 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht"] Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.200721 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jcnwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-trddt_openstack-operators(bb89e7bb-899f-4f3e-80cd-833fbc74db85): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.200777 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-972l4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-8b6ht_openstack-operators(369e7da7-22d9-470f-9ad0-48472ceffde4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.200886 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qccf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-lfh62_openstack-operators(0934b0fd-8a48-4dee-b668-08c7b631551f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.203206 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt"] Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.203457 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qccf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-lfh62_openstack-operators(0934b0fd-8a48-4dee-b668-08c7b631551f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.204474 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jcnwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-trddt_openstack-operators(bb89e7bb-899f-4f3e-80cd-833fbc74db85): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.204533 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" podUID="0934b0fd-8a48-4dee-b668-08c7b631551f" Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.204701 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-972l4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-8b6ht_openstack-operators(369e7da7-22d9-470f-9ad0-48472ceffde4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.205912 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" 
podUID="369e7da7-22d9-470f-9ad0-48472ceffde4" Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.205963 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" podUID="bb89e7bb-899f-4f3e-80cd-833fbc74db85" Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.321474 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct"] Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.343434 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4f4jd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-wx6ct_openstack-operators(92132c51-643c-4442-adf2-897bd2825fdf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 20:14:13 crc kubenswrapper[4802]: I1201 20:14:13.344450 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn"] Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.346151 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4f4jd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-wx6ct_openstack-operators(92132c51-643c-4442-adf2-897bd2825fdf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 20:14:13 crc kubenswrapper[4802]: E1201 20:14:13.347538 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" podUID="92132c51-643c-4442-adf2-897bd2825fdf" Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.044104 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t" event={"ID":"0e1b6ed3-9b66-4279-9a9f-0685037df9c3","Type":"ContainerStarted","Data":"8614eae2adae75525cb1d9a626bf5538a09744216d46d131593b5457165a0e39"} Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.046979 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v" event={"ID":"6973effc-3f05-43cd-ba03-b9efe3b6db1d","Type":"ContainerStarted","Data":"85235952aabbc2abefcdfff08f54fcadf2b70db8fc59f4dcdd12b89c2aa95699"} Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.051847 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q" event={"ID":"8e5dddc5-34ff-4a71-a626-3c9cea7ef30f","Type":"ContainerStarted","Data":"94ce741532bee3d5ddef62c140c4ed05d8a4ac14e439d82e9897ee405e9a644a"} Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.054190 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" event={"ID":"97d3762b-15ce-45aa-9767-5be47c85e039","Type":"ContainerStarted","Data":"fb5af7a86e1fb67f919ac54a980c2426451e9ff36db6eae32d82e49d39cc83af"} Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.056556 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" event={"ID":"088be214-85a6-4cb1-9e02-fcde44abb492","Type":"ContainerStarted","Data":"17fc5d8c216c5172d9ebd83093402921cb156c72a34374d1035619b2bd4ed3fb"} Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.058535 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" event={"ID":"c7839b31-af95-4d33-a954-9615ea0c87a6","Type":"ContainerStarted","Data":"67a440178424ac77c07fb3e6f2b20581a83476571179010c93cd414a0f2d7148"} Dec 01 20:14:14 crc 
kubenswrapper[4802]: I1201 20:14:14.060803 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" event={"ID":"92132c51-643c-4442-adf2-897bd2825fdf","Type":"ContainerStarted","Data":"a894b0d2e4a19e51337445518761b0c8e8318dd8c6c565a14b8b4a7b6338db0a"} Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.066664 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq" event={"ID":"1891b769-8e7e-4375-b3ea-421a23fb7af4","Type":"ContainerStarted","Data":"c17ae255cd301dbcd217a2834e4f067b0eb5113273bc96a80e3f4196da37ecb7"} Dec 01 20:14:14 crc kubenswrapper[4802]: E1201 20:14:14.066707 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" podUID="92132c51-643c-4442-adf2-897bd2825fdf" Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.069164 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" event={"ID":"369e7da7-22d9-470f-9ad0-48472ceffde4","Type":"ContainerStarted","Data":"61da6c0dcb7b3cad7c6c08adde43059848272c0c0f6e8354d082cd3fa3d5cdaf"} Dec 01 20:14:14 crc kubenswrapper[4802]: E1201 20:14:14.072466 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" podUID="369e7da7-22d9-470f-9ad0-48472ceffde4" Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.073260 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs" event={"ID":"e58c799d-fcaa-4d9b-aa6c-c8947774bd2e","Type":"ContainerStarted","Data":"9e37874e74038ebc802c82d1a22f92485ec09ccd59a822e556f45afd11e39f71"} Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.075659 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" event={"ID":"a3350b6c-2091-4a61-a78e-5a1bcdfd11cf","Type":"ContainerStarted","Data":"5f93986df71ba49d15aa91b5b2f5c7010e054f702102135938196fd419774e98"} Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.078700 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" event={"ID":"0934b0fd-8a48-4dee-b668-08c7b631551f","Type":"ContainerStarted","Data":"d978992e4f93155385f41a0e5386af543339561dabe8740f099587f6fc5fbd62"} Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.081576 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" event={"ID":"bb89e7bb-899f-4f3e-80cd-833fbc74db85","Type":"ContainerStarted","Data":"9d3e53d962353a71883ad976a14b5b14f51a9cbfa27c4b863545eb75891390d8"} Dec 01 20:14:14 crc kubenswrapper[4802]: E1201 20:14:14.088657 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" podUID="0934b0fd-8a48-4dee-b668-08c7b631551f" Dec 01 20:14:14 crc kubenswrapper[4802]: E1201 20:14:14.095794 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" podUID="bb89e7bb-899f-4f3e-80cd-833fbc74db85" Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.377636 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:14:14 crc kubenswrapper[4802]: E1201 20:14:14.377997 4802 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:14 crc kubenswrapper[4802]: E1201 20:14:14.378144 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert podName:b40759e9-9a00-445c-964e-09f1d539d85e nodeName:}" failed. 
No retries permitted until 2025-12-01 20:14:18.378098762 +0000 UTC m=+1079.940658403 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert") pod "infra-operator-controller-manager-57548d458d-jbfz7" (UID: "b40759e9-9a00-445c-964e-09f1d539d85e") : secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:14 crc kubenswrapper[4802]: I1201 20:14:14.792717 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:14 crc kubenswrapper[4802]: E1201 20:14:14.793219 4802 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:14 crc kubenswrapper[4802]: E1201 20:14:14.793440 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert podName:d12b9eb3-946b-4578-8630-4cb6643ab36f nodeName:}" failed. No retries permitted until 2025-12-01 20:14:18.793405388 +0000 UTC m=+1080.355965119 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" (UID: "d12b9eb3-946b-4578-8630-4cb6643ab36f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:15 crc kubenswrapper[4802]: E1201 20:14:15.096215 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" podUID="bb89e7bb-899f-4f3e-80cd-833fbc74db85" Dec 01 20:14:15 crc kubenswrapper[4802]: E1201 20:14:15.096511 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" podUID="92132c51-643c-4442-adf2-897bd2825fdf" Dec 01 20:14:15 crc kubenswrapper[4802]: E1201 20:14:15.097538 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" podUID="369e7da7-22d9-470f-9ad0-48472ceffde4" Dec 01 20:14:15 crc kubenswrapper[4802]: E1201 20:14:15.097535 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" podUID="0934b0fd-8a48-4dee-b668-08c7b631551f" Dec 01 20:14:15 crc kubenswrapper[4802]: I1201 20:14:15.098569 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:15 crc kubenswrapper[4802]: I1201 20:14:15.098619 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:15 crc kubenswrapper[4802]: E1201 20:14:15.098788 4802 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 20:14:15 crc kubenswrapper[4802]: E1201 
20:14:15.098835 4802 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 20:14:15 crc kubenswrapper[4802]: E1201 20:14:15.098854 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. No retries permitted until 2025-12-01 20:14:19.09883132 +0000 UTC m=+1080.661390961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "metrics-server-cert" not found Dec 01 20:14:15 crc kubenswrapper[4802]: E1201 20:14:15.098877 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. No retries permitted until 2025-12-01 20:14:19.098864971 +0000 UTC m=+1080.661424612 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "webhook-server-cert" not found Dec 01 20:14:18 crc kubenswrapper[4802]: I1201 20:14:18.397359 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:14:18 crc kubenswrapper[4802]: E1201 20:14:18.397626 4802 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:18 crc kubenswrapper[4802]: E1201 20:14:18.398172 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert podName:b40759e9-9a00-445c-964e-09f1d539d85e nodeName:}" failed. No retries permitted until 2025-12-01 20:14:26.398142456 +0000 UTC m=+1087.960702097 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert") pod "infra-operator-controller-manager-57548d458d-jbfz7" (UID: "b40759e9-9a00-445c-964e-09f1d539d85e") : secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:18 crc kubenswrapper[4802]: I1201 20:14:18.807189 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:18 crc kubenswrapper[4802]: E1201 20:14:18.807548 4802 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:18 crc kubenswrapper[4802]: E1201 20:14:18.807887 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert podName:d12b9eb3-946b-4578-8630-4cb6643ab36f nodeName:}" failed. No retries permitted until 2025-12-01 20:14:26.807857966 +0000 UTC m=+1088.370417607 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" (UID: "d12b9eb3-946b-4578-8630-4cb6643ab36f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 20:14:19 crc kubenswrapper[4802]: I1201 20:14:19.112717 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:19 crc kubenswrapper[4802]: I1201 20:14:19.112786 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:19 crc kubenswrapper[4802]: E1201 20:14:19.112943 4802 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 20:14:19 crc kubenswrapper[4802]: E1201 20:14:19.113007 4802 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 20:14:19 crc kubenswrapper[4802]: E1201 20:14:19.113038 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. No retries permitted until 2025-12-01 20:14:27.113013239 +0000 UTC m=+1088.675572880 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "metrics-server-cert" not found Dec 01 20:14:19 crc kubenswrapper[4802]: E1201 20:14:19.113091 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs podName:aebbca29-71df-4bef-8108-66b226259a58 nodeName:}" failed. No retries permitted until 2025-12-01 20:14:27.113066181 +0000 UTC m=+1088.675625822 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs") pod "openstack-operator-controller-manager-79774867dd-5sjpr" (UID: "aebbca29-71df-4bef-8108-66b226259a58") : secret "webhook-server-cert" not found Dec 01 20:14:25 crc kubenswrapper[4802]: E1201 20:14:25.180647 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 01 20:14:25 crc kubenswrapper[4802]: E1201 20:14:25.181541 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s7ckt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-5ftkv_openstack-operators(8626cbeb-8604-4371-b936-99cab8d76742): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:26 crc kubenswrapper[4802]: E1201 20:14:26.321417 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 01 20:14:26 crc kubenswrapper[4802]: E1201 20:14:26.321833 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4pl9r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-jcmlp_openstack-operators(74aa06c0-a03f-4719-b751-a77ab3d472f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:26 crc kubenswrapper[4802]: I1201 20:14:26.409347 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:14:26 crc kubenswrapper[4802]: E1201 20:14:26.409766 4802 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:26 crc kubenswrapper[4802]: E1201 20:14:26.409852 4802 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert podName:b40759e9-9a00-445c-964e-09f1d539d85e nodeName:}" failed. 
No retries permitted until 2025-12-01 20:14:42.409826314 +0000 UTC m=+1103.972385965 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert") pod "infra-operator-controller-manager-57548d458d-jbfz7" (UID: "b40759e9-9a00-445c-964e-09f1d539d85e") : secret "infra-operator-webhook-server-cert" not found Dec 01 20:14:26 crc kubenswrapper[4802]: I1201 20:14:26.817560 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:26 crc kubenswrapper[4802]: I1201 20:14:26.825297 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d12b9eb3-946b-4578-8630-4cb6643ab36f-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4552rv\" (UID: \"d12b9eb3-946b-4578-8630-4cb6643ab36f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:26 crc kubenswrapper[4802]: I1201 20:14:26.968929 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-z29b9" Dec 01 20:14:26 crc kubenswrapper[4802]: I1201 20:14:26.976564 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:27 crc kubenswrapper[4802]: I1201 20:14:27.121908 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:27 crc kubenswrapper[4802]: I1201 20:14:27.121985 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:27 crc kubenswrapper[4802]: I1201 20:14:27.129130 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-webhook-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:27 crc kubenswrapper[4802]: I1201 20:14:27.129464 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aebbca29-71df-4bef-8108-66b226259a58-metrics-certs\") pod \"openstack-operator-controller-manager-79774867dd-5sjpr\" (UID: \"aebbca29-71df-4bef-8108-66b226259a58\") " pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:27 crc kubenswrapper[4802]: I1201 20:14:27.301787 4802 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8h28j" Dec 01 20:14:27 crc kubenswrapper[4802]: I1201 20:14:27.309549 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:28 crc kubenswrapper[4802]: I1201 20:14:28.089400 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:14:28 crc kubenswrapper[4802]: I1201 20:14:28.089495 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:14:34 crc kubenswrapper[4802]: E1201 20:14:34.942662 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621" Dec 01 20:14:34 crc kubenswrapper[4802]: E1201 20:14:34.943720 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4dbvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-r89vq_openstack-operators(97d3762b-15ce-45aa-9767-5be47c85e039): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:36 crc kubenswrapper[4802]: E1201 20:14:36.331031 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 01 20:14:36 crc kubenswrapper[4802]: E1201 20:14:36.331735 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cddv7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-45m8m_openstack-operators(088be214-85a6-4cb1-9e02-fcde44abb492): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:38 crc kubenswrapper[4802]: E1201 20:14:38.204338 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 01 20:14:38 crc kubenswrapper[4802]: E1201 20:14:38.204630 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7cxd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-cswfs_openstack-operators(e58c799d-fcaa-4d9b-aa6c-c8947774bd2e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:38 crc kubenswrapper[4802]: E1201 20:14:38.996618 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 01 20:14:38 crc kubenswrapper[4802]: E1201 20:14:38.997517 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d22fw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-p475v_openstack-operators(6973effc-3f05-43cd-ba03-b9efe3b6db1d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:40 crc kubenswrapper[4802]: E1201 20:14:40.917772 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 01 20:14:40 crc kubenswrapper[4802]: E1201 20:14:40.918008 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fz47n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mnwdn_openstack-operators(a3350b6c-2091-4a61-a78e-5a1bcdfd11cf): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:40 crc kubenswrapper[4802]: E1201 20:14:40.919292 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" podUID="a3350b6c-2091-4a61-a78e-5a1bcdfd11cf" Dec 01 20:14:41 crc kubenswrapper[4802]: E1201 20:14:41.341332 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" podUID="a3350b6c-2091-4a61-a78e-5a1bcdfd11cf" Dec 01 20:14:41 crc kubenswrapper[4802]: E1201 20:14:41.362478 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 01 20:14:41 crc kubenswrapper[4802]: E1201 20:14:41.362892 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99z68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-vb97q_openstack-operators(8e5dddc5-34ff-4a71-a626-3c9cea7ef30f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:42 crc kubenswrapper[4802]: E1201 20:14:42.329947 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3" Dec 01 20:14:42 crc kubenswrapper[4802]: E1201 20:14:42.330311 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f92lc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-2nm2t_openstack-operators(0e1b6ed3-9b66-4279-9a9f-0685037df9c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:42 crc kubenswrapper[4802]: I1201 20:14:42.496059 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:14:42 crc kubenswrapper[4802]: I1201 20:14:42.521317 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b40759e9-9a00-445c-964e-09f1d539d85e-cert\") pod \"infra-operator-controller-manager-57548d458d-jbfz7\" (UID: \"b40759e9-9a00-445c-964e-09f1d539d85e\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:14:42 crc kubenswrapper[4802]: I1201 20:14:42.643607 4802 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7cwbk" Dec 01 20:14:42 crc kubenswrapper[4802]: I1201 20:14:42.651502 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:14:46 crc kubenswrapper[4802]: E1201 20:14:46.930887 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 01 20:14:46 crc kubenswrapper[4802]: E1201 20:14:46.932154 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pw7x8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-djvhl_openstack-operators(c7839b31-af95-4d33-a954-9615ea0c87a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:14:46 crc kubenswrapper[4802]: I1201 20:14:46.936127 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 20:14:47 crc kubenswrapper[4802]: I1201 20:14:47.670945 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv"] Dec 01 20:14:47 crc kubenswrapper[4802]: W1201 20:14:47.739192 4802 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd12b9eb3_946b_4578_8630_4cb6643ab36f.slice/crio-3a9e014976160ebeafa45e87426b526c8f24d9570f964f49736314348cda351b WatchSource:0}: Error finding container 3a9e014976160ebeafa45e87426b526c8f24d9570f964f49736314348cda351b: Status 404 returned error can't find the container with id 3a9e014976160ebeafa45e87426b526c8f24d9570f964f49736314348cda351b Dec 01 20:14:47 crc kubenswrapper[4802]: I1201 20:14:47.740855 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr"] Dec 01 20:14:47 crc kubenswrapper[4802]: W1201 20:14:47.888005 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaebbca29_71df_4bef_8108_66b226259a58.slice/crio-2a949205f9be534f8681606185970ee630c21fff07f1b75b4ba0411af909c96a WatchSource:0}: Error finding container 2a949205f9be534f8681606185970ee630c21fff07f1b75b4ba0411af909c96a: Status 404 returned error can't find the container with id 2a949205f9be534f8681606185970ee630c21fff07f1b75b4ba0411af909c96a Dec 01 20:14:47 crc kubenswrapper[4802]: I1201 20:14:47.956860 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7"] Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.497970 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj" event={"ID":"e5c436a3-2237-4f02-a9fc-b2aae90ce3b1","Type":"ContainerStarted","Data":"7c3951f40c561a40bf72e0cbe26610e974be26b705a41dc4d6f0c323c2719984"} Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.501652 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz" 
event={"ID":"b01ea1d5-0409-4c32-bb34-1b88253ceb05","Type":"ContainerStarted","Data":"74b15771f667c2fc29ab81c3b363e6f92e7e85ad0ecadb97b4614fe14fa7f963"} Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.503063 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb" event={"ID":"e067c10a-e5d4-4e57-bf14-3b0bfc8ac069","Type":"ContainerStarted","Data":"a6430c58940abeb1b19ea2a98617e2105088f8907ef4102380a93892c44f814e"} Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.504670 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg" event={"ID":"18526d53-2d4c-4c40-885c-c83b3b378260","Type":"ContainerStarted","Data":"205482a16402f9a387ae6155f81c5db6cbc6318b1aef25a14eb3f4def81ec975"} Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.509315 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" event={"ID":"aebbca29-71df-4bef-8108-66b226259a58","Type":"ContainerStarted","Data":"2a949205f9be534f8681606185970ee630c21fff07f1b75b4ba0411af909c96a"} Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.527358 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb" event={"ID":"eb553ce8-f696-4c6b-a745-aa1faa5f9356","Type":"ContainerStarted","Data":"86eedf0399c1e25918b8d5b495374d73883e53d7f43791003162516ec93fc628"} Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.530018 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq" event={"ID":"1891b769-8e7e-4375-b3ea-421a23fb7af4","Type":"ContainerStarted","Data":"fe89d1b0a3334860705e2b88cd570ae01c017435c0678bda4c334f1f491b6d2c"} Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.531516 4802 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" event={"ID":"d12b9eb3-946b-4578-8630-4cb6643ab36f","Type":"ContainerStarted","Data":"3a9e014976160ebeafa45e87426b526c8f24d9570f964f49736314348cda351b"} Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.533216 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" event={"ID":"369e7da7-22d9-470f-9ad0-48472ceffde4","Type":"ContainerStarted","Data":"2de12c0d70b18014706a2c00fdca425fdfdb7a3d4c43147371c32894013cd168"} Dec 01 20:14:48 crc kubenswrapper[4802]: I1201 20:14:48.535710 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" event={"ID":"b40759e9-9a00-445c-964e-09f1d539d85e","Type":"ContainerStarted","Data":"f132856a2bc59a03cc6f688d5976aed46996c4c8628a500949766c4f38a417e3"} Dec 01 20:14:49 crc kubenswrapper[4802]: I1201 20:14:49.784362 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" event={"ID":"bb89e7bb-899f-4f3e-80cd-833fbc74db85","Type":"ContainerStarted","Data":"54e8441d865b2242db9ab869e938ecd475282afc5c6afb0b2d1ac5d0347303c3"} Dec 01 20:14:51 crc kubenswrapper[4802]: E1201 20:14:51.514469 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t" podUID="0e1b6ed3-9b66-4279-9a9f-0685037df9c3" Dec 01 20:14:51 crc kubenswrapper[4802]: E1201 20:14:51.543415 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" podUID="c7839b31-af95-4d33-a954-9615ea0c87a6" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.819498 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" event={"ID":"c7839b31-af95-4d33-a954-9615ea0c87a6","Type":"ContainerStarted","Data":"c7b350b48b3405793202273235f282fbfb18f584d27791b5d55be2bfda0e15f0"} Dec 01 20:14:51 crc kubenswrapper[4802]: E1201 20:14:51.830485 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" podUID="c7839b31-af95-4d33-a954-9615ea0c87a6" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.837674 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" event={"ID":"92132c51-643c-4442-adf2-897bd2825fdf","Type":"ContainerStarted","Data":"7b432270f902e2ca97458bb37016d6ad7b91e74e9adb4150ed0ff8e7eb3023d1"} Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.837740 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" event={"ID":"92132c51-643c-4442-adf2-897bd2825fdf","Type":"ContainerStarted","Data":"5345966ad335145614d2011dffc193e5f97511f0cb42ab55f323b1dbf6e7a8cc"} Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.838093 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.863373 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj" event={"ID":"e5c436a3-2237-4f02-a9fc-b2aae90ce3b1","Type":"ContainerStarted","Data":"24b60c1c8163cddcab72c0142029082add2c7bb5b7c72f50cb11d11e2d111c65"} Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.864129 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.884421 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t" event={"ID":"0e1b6ed3-9b66-4279-9a9f-0685037df9c3","Type":"ContainerStarted","Data":"253669d06d10eb96915f2b8d5930e9643534d40e58624ceecd423c611dbe687c"} Dec 01 20:14:51 crc kubenswrapper[4802]: E1201 20:14:51.889663 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t" podUID="0e1b6ed3-9b66-4279-9a9f-0685037df9c3" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.900447 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb" event={"ID":"e067c10a-e5d4-4e57-bf14-3b0bfc8ac069","Type":"ContainerStarted","Data":"e51c53e4e26b7d30301a2d85f1b5d65255ab5afabd97559cf6f865a6657cf37c"} Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.901070 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.902551 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" event={"ID":"0934b0fd-8a48-4dee-b668-08c7b631551f","Type":"ContainerStarted","Data":"ed176fb9ce6fda8cf9c0d1c0c53b13570b6fa5508f01bcd659839863eed27241"} Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.902585 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" event={"ID":"0934b0fd-8a48-4dee-b668-08c7b631551f","Type":"ContainerStarted","Data":"f7e5aac931f6a0140cdd12bc61094f0e93aeb4b379793ac7d20bebeae21fdc04"} Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.902924 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.905317 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" event={"ID":"bb89e7bb-899f-4f3e-80cd-833fbc74db85","Type":"ContainerStarted","Data":"f5d29c5635992ff3a072609aa8833069f3743489d065b8f866eb81bf01ee06fb"} Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.905675 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.906596 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" event={"ID":"aebbca29-71df-4bef-8108-66b226259a58","Type":"ContainerStarted","Data":"d233873de866d14c75241259ac87ffd44b0e3480990bf53273ab107170487865"} Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.906966 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.963702 4802 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" podStartSLOduration=7.999652038 podStartE2EDuration="41.963674652s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.343258482 +0000 UTC m=+1074.905818123" lastFinishedPulling="2025-12-01 20:14:47.307281095 +0000 UTC m=+1108.869840737" observedRunningTime="2025-12-01 20:14:51.962493474 +0000 UTC m=+1113.525053115" watchObservedRunningTime="2025-12-01 20:14:51.963674652 +0000 UTC m=+1113.526234293" Dec 01 20:14:51 crc kubenswrapper[4802]: I1201 20:14:51.996335 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" podStartSLOduration=4.176016399 podStartE2EDuration="41.996315214s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.200502658 +0000 UTC m=+1074.763062299" lastFinishedPulling="2025-12-01 20:14:51.020801473 +0000 UTC m=+1112.583361114" observedRunningTime="2025-12-01 20:14:51.98977757 +0000 UTC m=+1113.552337221" watchObservedRunningTime="2025-12-01 20:14:51.996315214 +0000 UTC m=+1113.558874855" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.109071 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" podStartSLOduration=41.109050208 podStartE2EDuration="41.109050208s" podCreationTimestamp="2025-12-01 20:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:14:52.081753942 +0000 UTC m=+1113.644313583" watchObservedRunningTime="2025-12-01 20:14:52.109050208 +0000 UTC m=+1113.671609859" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.134175 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" podStartSLOduration=8.012452539 podStartE2EDuration="42.134157134s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.200805518 +0000 UTC m=+1074.763365159" lastFinishedPulling="2025-12-01 20:14:47.322510113 +0000 UTC m=+1108.885069754" observedRunningTime="2025-12-01 20:14:52.128394314 +0000 UTC m=+1113.690953955" watchObservedRunningTime="2025-12-01 20:14:52.134157134 +0000 UTC m=+1113.696716775" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.153037 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb" podStartSLOduration=3.683187855 podStartE2EDuration="42.153014245s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:12.606251245 +0000 UTC m=+1074.168810886" lastFinishedPulling="2025-12-01 20:14:51.076077635 +0000 UTC m=+1112.638637276" observedRunningTime="2025-12-01 20:14:52.143522808 +0000 UTC m=+1113.706082449" watchObservedRunningTime="2025-12-01 20:14:52.153014245 +0000 UTC m=+1113.715573886" Dec 01 20:14:52 crc kubenswrapper[4802]: E1201 20:14:52.207370 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q" podUID="8e5dddc5-34ff-4a71-a626-3c9cea7ef30f" Dec 01 20:14:52 crc kubenswrapper[4802]: E1201 20:14:52.256372 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v" podUID="6973effc-3f05-43cd-ba03-b9efe3b6db1d" Dec 01 20:14:52 crc kubenswrapper[4802]: E1201 
20:14:52.638903 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" podUID="97d3762b-15ce-45aa-9767-5be47c85e039" Dec 01 20:14:52 crc kubenswrapper[4802]: E1201 20:14:52.640702 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv" podUID="8626cbeb-8604-4371-b936-99cab8d76742" Dec 01 20:14:52 crc kubenswrapper[4802]: E1201 20:14:52.649404 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" podUID="088be214-85a6-4cb1-9e02-fcde44abb492" Dec 01 20:14:52 crc kubenswrapper[4802]: E1201 20:14:52.650010 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs" podUID="e58c799d-fcaa-4d9b-aa6c-c8947774bd2e" Dec 01 20:14:52 crc kubenswrapper[4802]: E1201 20:14:52.673189 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp" podUID="74aa06c0-a03f-4719-b751-a77ab3d472f2" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.915268 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" event={"ID":"088be214-85a6-4cb1-9e02-fcde44abb492","Type":"ContainerStarted","Data":"db45f82b6aca9a0c6ba80da5af3a384cb357d89c1f8d534ca42be039b45425c7"} Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.920485 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz" event={"ID":"b01ea1d5-0409-4c32-bb34-1b88253ceb05","Type":"ContainerStarted","Data":"73cc46da16f45041364f323943da7e1b245375e578aa68f06f32292fdd174db9"} Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.920545 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.922130 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv" event={"ID":"8626cbeb-8604-4371-b936-99cab8d76742","Type":"ContainerStarted","Data":"035170cabeb1938863205d691ec8ce912e51e6ecf40bb64b8829d0b04139e289"} Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.924421 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q" event={"ID":"8e5dddc5-34ff-4a71-a626-3c9cea7ef30f","Type":"ContainerStarted","Data":"e2706190a4f56e7d4959dd51d02a6d67d8d0ce63d6e18432c7fc2067d3fd6e21"} Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.925144 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.927367 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp" 
event={"ID":"74aa06c0-a03f-4719-b751-a77ab3d472f2","Type":"ContainerStarted","Data":"ba4a297f71fbe146bb7f221b8abf78ee9bea1faf3af641dd885c7a5274fbc297"} Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.929228 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" event={"ID":"97d3762b-15ce-45aa-9767-5be47c85e039","Type":"ContainerStarted","Data":"904e85dc0d6b1a47266553945a9bd2c278a5a57fe1d8abfed4a2c467ff9aa41e"} Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.934455 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg" event={"ID":"18526d53-2d4c-4c40-885c-c83b3b378260","Type":"ContainerStarted","Data":"6fc7ed1c5ecaee2e30e68a2779a3fef33688de035f7d7b597f33194b45345705"} Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.935116 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.946543 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb" event={"ID":"eb553ce8-f696-4c6b-a745-aa1faa5f9356","Type":"ContainerStarted","Data":"b357c6746edb3b92a9997ca880cb7b80d32efdb212cb4901672c617ba805a33b"} Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.946824 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.955769 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq" event={"ID":"1891b769-8e7e-4375-b3ea-421a23fb7af4","Type":"ContainerStarted","Data":"81152e80356024ea463b24cc5c1948cb6f05a4783729ce6e93feb5b99f93de27"} Dec 01 20:14:52 crc 
kubenswrapper[4802]: I1201 20:14:52.956411 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.961915 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj" podStartSLOduration=4.563596716 podStartE2EDuration="42.961880434s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:12.623604299 +0000 UTC m=+1074.186163940" lastFinishedPulling="2025-12-01 20:14:51.021888007 +0000 UTC m=+1112.584447658" observedRunningTime="2025-12-01 20:14:52.188653173 +0000 UTC m=+1113.751212824" watchObservedRunningTime="2025-12-01 20:14:52.961880434 +0000 UTC m=+1114.524440075" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.962471 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.984520 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" event={"ID":"369e7da7-22d9-470f-9ad0-48472ceffde4","Type":"ContainerStarted","Data":"0714d3857cfc10601c103bceadf5b7b31e36340f57bc045a817d717d9a28d2b6"} Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.985242 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.986896 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs" event={"ID":"e58c799d-fcaa-4d9b-aa6c-c8947774bd2e","Type":"ContainerStarted","Data":"d07366243f054635e53058cf8647949d905f72942d27423b369698583d829b6b"} Dec 01 20:14:52 crc 
kubenswrapper[4802]: I1201 20:14:52.989392 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v" event={"ID":"6973effc-3f05-43cd-ba03-b9efe3b6db1d","Type":"ContainerStarted","Data":"51dff52f7277bcb6534ead38e701581644ee425038ff90287f9328ca1b1810fc"} Dec 01 20:14:52 crc kubenswrapper[4802]: E1201 20:14:52.991864 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" podUID="c7839b31-af95-4d33-a954-9615ea0c87a6" Dec 01 20:14:52 crc kubenswrapper[4802]: I1201 20:14:52.996063 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kmtdj" Dec 01 20:14:53 crc kubenswrapper[4802]: I1201 20:14:53.049756 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-qbvvz" podStartSLOduration=3.5398592840000003 podStartE2EDuration="43.049738708s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:12.133098337 +0000 UTC m=+1073.695657978" lastFinishedPulling="2025-12-01 20:14:51.642977761 +0000 UTC m=+1113.205537402" observedRunningTime="2025-12-01 20:14:53.047516948 +0000 UTC m=+1114.610076589" watchObservedRunningTime="2025-12-01 20:14:53.049738708 +0000 UTC m=+1114.612298349" Dec 01 20:14:53 crc kubenswrapper[4802]: I1201 20:14:53.103152 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg" podStartSLOduration=3.3681724219999998 podStartE2EDuration="43.103134031s" 
podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:11.805344665 +0000 UTC m=+1073.367904306" lastFinishedPulling="2025-12-01 20:14:51.540306274 +0000 UTC m=+1113.102865915" observedRunningTime="2025-12-01 20:14:53.095554164 +0000 UTC m=+1114.658113815" watchObservedRunningTime="2025-12-01 20:14:53.103134031 +0000 UTC m=+1114.665693672" Dec 01 20:14:53 crc kubenswrapper[4802]: I1201 20:14:53.231738 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-7stkq" podStartSLOduration=5.070273154 podStartE2EDuration="43.23169973s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.163680084 +0000 UTC m=+1074.726239725" lastFinishedPulling="2025-12-01 20:14:51.32510666 +0000 UTC m=+1112.887666301" observedRunningTime="2025-12-01 20:14:53.220022204 +0000 UTC m=+1114.782581845" watchObservedRunningTime="2025-12-01 20:14:53.23169973 +0000 UTC m=+1114.794259361" Dec 01 20:14:53 crc kubenswrapper[4802]: I1201 20:14:53.298036 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb" podStartSLOduration=4.129043547 podStartE2EDuration="43.297999068s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:12.130399682 +0000 UTC m=+1073.692959323" lastFinishedPulling="2025-12-01 20:14:51.299355203 +0000 UTC m=+1112.861914844" observedRunningTime="2025-12-01 20:14:53.284521075 +0000 UTC m=+1114.847080716" watchObservedRunningTime="2025-12-01 20:14:53.297999068 +0000 UTC m=+1114.860558709" Dec 01 20:14:53 crc kubenswrapper[4802]: I1201 20:14:53.334551 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" podStartSLOduration=5.459083959 podStartE2EDuration="43.334533353s" 
podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.200670543 +0000 UTC m=+1074.763230184" lastFinishedPulling="2025-12-01 20:14:51.076119937 +0000 UTC m=+1112.638679578" observedRunningTime="2025-12-01 20:14:53.329098552 +0000 UTC m=+1114.891658193" watchObservedRunningTime="2025-12-01 20:14:53.334533353 +0000 UTC m=+1114.897092994" Dec 01 20:14:54 crc kubenswrapper[4802]: I1201 20:14:54.000041 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" event={"ID":"088be214-85a6-4cb1-9e02-fcde44abb492","Type":"ContainerStarted","Data":"fa2fbc5407a454a7a509dc21a9d75eae0565fe0a4895b916358ea76a023f4dc3"} Dec 01 20:14:54 crc kubenswrapper[4802]: I1201 20:14:54.006782 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-n7mjb" Dec 01 20:14:54 crc kubenswrapper[4802]: I1201 20:14:54.006965 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-6d8qg" Dec 01 20:14:54 crc kubenswrapper[4802]: I1201 20:14:54.007030 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-776c976b46-x7bkb" Dec 01 20:14:54 crc kubenswrapper[4802]: I1201 20:14:54.007058 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-trddt" Dec 01 20:14:54 crc kubenswrapper[4802]: I1201 20:14:54.007261 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-8b6ht" Dec 01 20:14:54 crc kubenswrapper[4802]: I1201 20:14:54.121493 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" podStartSLOduration=3.876342218 podStartE2EDuration="44.121472675s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.155183558 +0000 UTC m=+1074.717743199" lastFinishedPulling="2025-12-01 20:14:53.400314015 +0000 UTC m=+1114.962873656" observedRunningTime="2025-12-01 20:14:54.042637454 +0000 UTC m=+1115.605197095" watchObservedRunningTime="2025-12-01 20:14:54.121472675 +0000 UTC m=+1115.684032316" Dec 01 20:14:55 crc kubenswrapper[4802]: I1201 20:14:55.016112 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" Dec 01 20:14:57 crc kubenswrapper[4802]: I1201 20:14:57.317578 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-79774867dd-5sjpr" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.081501 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" event={"ID":"d12b9eb3-946b-4578-8630-4cb6643ab36f","Type":"ContainerStarted","Data":"37aad9b710e2e11d849107ce33b5e5f6dcb2327d5fcf36b1cfb818075dac11e4"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.082081 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" event={"ID":"d12b9eb3-946b-4578-8630-4cb6643ab36f","Type":"ContainerStarted","Data":"e93c47ffeb789b49ca987865ecef62fed6df86ecd3044ce67f9826998cdb5f42"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.082188 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.089483 4802 patch_prober.go:28] interesting 
pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.089538 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.096016 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs" event={"ID":"e58c799d-fcaa-4d9b-aa6c-c8947774bd2e","Type":"ContainerStarted","Data":"4f9aaa4709b91933a6089880af64f811ecb8c709478efd1cfbf5fff99702a445"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.096179 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.101680 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v" event={"ID":"6973effc-3f05-43cd-ba03-b9efe3b6db1d","Type":"ContainerStarted","Data":"f4a47736eb19dc5cbab43dbcd8163ef9f2880ab86d515ee5fe247192912af0f1"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.102816 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.106476 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q" 
event={"ID":"8e5dddc5-34ff-4a71-a626-3c9cea7ef30f","Type":"ContainerStarted","Data":"a93fc3c23f0ec4f1548565916c2171acb7a88e5de9037ccf6857515da6cad49d"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.106926 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.109886 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" event={"ID":"a3350b6c-2091-4a61-a78e-5a1bcdfd11cf","Type":"ContainerStarted","Data":"1f429daa442992b090757cbc673ae450c2b772b11f2c29f787249c52d4ba884b"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.117788 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp" event={"ID":"74aa06c0-a03f-4719-b751-a77ab3d472f2","Type":"ContainerStarted","Data":"5a9153982982af1781c3e31817907c113437827522cdd5c773fe6871a13253c1"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.118371 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.126026 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" event={"ID":"97d3762b-15ce-45aa-9767-5be47c85e039","Type":"ContainerStarted","Data":"bfd55ea0fcc6b28899d4d4d8a448be36609fb178a2cccfcb9d89b2715152bf9a"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.126248 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.130458 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv" event={"ID":"8626cbeb-8604-4371-b936-99cab8d76742","Type":"ContainerStarted","Data":"9e4b58863b657440e70a044494a68c16fda40f025d7c37876fde4f364f8bd29e"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.132312 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.135356 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" podStartSLOduration=38.653934207 podStartE2EDuration="48.135333834s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:47.746343815 +0000 UTC m=+1109.308903456" lastFinishedPulling="2025-12-01 20:14:57.227743432 +0000 UTC m=+1118.790303083" observedRunningTime="2025-12-01 20:14:58.132421693 +0000 UTC m=+1119.694981334" watchObservedRunningTime="2025-12-01 20:14:58.135333834 +0000 UTC m=+1119.697893475" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.137610 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t" event={"ID":"0e1b6ed3-9b66-4279-9a9f-0685037df9c3","Type":"ContainerStarted","Data":"bb4698b18016f7eda45faa23a555f046370b4b7131d5419644d8da334be08458"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.138482 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.151243 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" 
event={"ID":"b40759e9-9a00-445c-964e-09f1d539d85e","Type":"ContainerStarted","Data":"cb4381ab6523f1240cc17c714225b1af2f11bfdf33978523096cc278f2faebb2"} Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.186697 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" podStartSLOduration=4.23477136 podStartE2EDuration="48.186676114s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.194074196 +0000 UTC m=+1074.756633837" lastFinishedPulling="2025-12-01 20:14:57.14597895 +0000 UTC m=+1118.708538591" observedRunningTime="2025-12-01 20:14:58.182859454 +0000 UTC m=+1119.745419105" watchObservedRunningTime="2025-12-01 20:14:58.186676114 +0000 UTC m=+1119.749235755" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.247157 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv" podStartSLOduration=3.880383644 podStartE2EDuration="48.247123337s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:12.781241319 +0000 UTC m=+1074.343800960" lastFinishedPulling="2025-12-01 20:14:57.147981012 +0000 UTC m=+1118.710540653" observedRunningTime="2025-12-01 20:14:58.244671191 +0000 UTC m=+1119.807230842" watchObservedRunningTime="2025-12-01 20:14:58.247123337 +0000 UTC m=+1119.809682978" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.251136 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q" podStartSLOduration=4.194393135 podStartE2EDuration="48.251122693s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.154445024 +0000 UTC m=+1074.717004675" lastFinishedPulling="2025-12-01 20:14:57.211174582 +0000 UTC m=+1118.773734233" observedRunningTime="2025-12-01 
20:14:58.211676567 +0000 UTC m=+1119.774236208" watchObservedRunningTime="2025-12-01 20:14:58.251122693 +0000 UTC m=+1119.813682344" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.310046 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v" podStartSLOduration=6.219297694 podStartE2EDuration="48.310024049s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.130954949 +0000 UTC m=+1074.693514590" lastFinishedPulling="2025-12-01 20:14:55.221681314 +0000 UTC m=+1116.784240945" observedRunningTime="2025-12-01 20:14:58.308270095 +0000 UTC m=+1119.870829756" watchObservedRunningTime="2025-12-01 20:14:58.310024049 +0000 UTC m=+1119.872583690" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.310358 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs" podStartSLOduration=4.212407569 podStartE2EDuration="48.31035085s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.132524797 +0000 UTC m=+1074.695084438" lastFinishedPulling="2025-12-01 20:14:57.230468078 +0000 UTC m=+1118.793027719" observedRunningTime="2025-12-01 20:14:58.276803308 +0000 UTC m=+1119.839362969" watchObservedRunningTime="2025-12-01 20:14:58.31035085 +0000 UTC m=+1119.872910491" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.351188 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp" podStartSLOduration=3.867529322 podStartE2EDuration="48.351145708s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:12.654619511 +0000 UTC m=+1074.217179152" lastFinishedPulling="2025-12-01 20:14:57.138235897 +0000 UTC m=+1118.700795538" observedRunningTime="2025-12-01 20:14:58.342573969 
+0000 UTC m=+1119.905133640" watchObservedRunningTime="2025-12-01 20:14:58.351145708 +0000 UTC m=+1119.913705349" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.376756 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnwdn" podStartSLOduration=3.462224769 podStartE2EDuration="47.37673584s" podCreationTimestamp="2025-12-01 20:14:11 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.342770896 +0000 UTC m=+1074.905330537" lastFinishedPulling="2025-12-01 20:14:57.257281947 +0000 UTC m=+1118.819841608" observedRunningTime="2025-12-01 20:14:58.373186678 +0000 UTC m=+1119.935746329" watchObservedRunningTime="2025-12-01 20:14:58.37673584 +0000 UTC m=+1119.939295481" Dec 01 20:14:58 crc kubenswrapper[4802]: I1201 20:14:58.410396 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t" podStartSLOduration=4.228275768 podStartE2EDuration="48.410377834s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.044064526 +0000 UTC m=+1074.606624167" lastFinishedPulling="2025-12-01 20:14:57.226166582 +0000 UTC m=+1118.788726233" observedRunningTime="2025-12-01 20:14:58.399167013 +0000 UTC m=+1119.961726654" watchObservedRunningTime="2025-12-01 20:14:58.410377834 +0000 UTC m=+1119.972937475" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.179673 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7"] Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.180857 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.183287 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.183698 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.196785 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7"] Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.262528 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9knwv\" (UniqueName: \"kubernetes.io/projected/dc4bc05f-4541-4ffe-84b9-b3b54d244094-kube-api-access-9knwv\") pod \"collect-profiles-29410335-8p7k7\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.262677 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4bc05f-4541-4ffe-84b9-b3b54d244094-config-volume\") pod \"collect-profiles-29410335-8p7k7\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.262735 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4bc05f-4541-4ffe-84b9-b3b54d244094-secret-volume\") pod \"collect-profiles-29410335-8p7k7\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.364009 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9knwv\" (UniqueName: \"kubernetes.io/projected/dc4bc05f-4541-4ffe-84b9-b3b54d244094-kube-api-access-9knwv\") pod \"collect-profiles-29410335-8p7k7\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.364120 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4bc05f-4541-4ffe-84b9-b3b54d244094-config-volume\") pod \"collect-profiles-29410335-8p7k7\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.364168 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4bc05f-4541-4ffe-84b9-b3b54d244094-secret-volume\") pod \"collect-profiles-29410335-8p7k7\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.366004 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4bc05f-4541-4ffe-84b9-b3b54d244094-config-volume\") pod \"collect-profiles-29410335-8p7k7\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.372172 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/dc4bc05f-4541-4ffe-84b9-b3b54d244094-secret-volume\") pod \"collect-profiles-29410335-8p7k7\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.385712 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9knwv\" (UniqueName: \"kubernetes.io/projected/dc4bc05f-4541-4ffe-84b9-b3b54d244094-kube-api-access-9knwv\") pod \"collect-profiles-29410335-8p7k7\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:00 crc kubenswrapper[4802]: I1201 20:15:00.501443 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:01 crc kubenswrapper[4802]: I1201 20:15:01.032234 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7"] Dec 01 20:15:01 crc kubenswrapper[4802]: I1201 20:15:01.181745 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" event={"ID":"b40759e9-9a00-445c-964e-09f1d539d85e","Type":"ContainerStarted","Data":"6f7a44dc14ada946ce78e11adee4e8d6861d14684ab9bf4b2e173254e0e99529"} Dec 01 20:15:01 crc kubenswrapper[4802]: I1201 20:15:01.182205 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:15:01 crc kubenswrapper[4802]: I1201 20:15:01.184328 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" event={"ID":"dc4bc05f-4541-4ffe-84b9-b3b54d244094","Type":"ContainerStarted","Data":"872ea8c60f04497256635647c7d851c29dd270639b958ee730bd453521e6951b"} Dec 01 20:15:01 crc 
kubenswrapper[4802]: I1201 20:15:01.213609 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" podStartSLOduration=42.0705016 podStartE2EDuration="51.213588064s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:48.08344245 +0000 UTC m=+1109.646002091" lastFinishedPulling="2025-12-01 20:14:57.226528904 +0000 UTC m=+1118.789088555" observedRunningTime="2025-12-01 20:15:01.206940125 +0000 UTC m=+1122.769499766" watchObservedRunningTime="2025-12-01 20:15:01.213588064 +0000 UTC m=+1122.776147705" Dec 01 20:15:01 crc kubenswrapper[4802]: I1201 20:15:01.474287 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-45m8m" Dec 01 20:15:01 crc kubenswrapper[4802]: I1201 20:15:01.530019 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-lfh62" Dec 01 20:15:01 crc kubenswrapper[4802]: I1201 20:15:01.625186 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wx6ct" Dec 01 20:15:02 crc kubenswrapper[4802]: I1201 20:15:02.194387 4802 generic.go:334] "Generic (PLEG): container finished" podID="dc4bc05f-4541-4ffe-84b9-b3b54d244094" containerID="6175fdd46891bd81af38fd579143a31ab1e715b0466c87fce1a1553120da1c14" exitCode=0 Dec 01 20:15:02 crc kubenswrapper[4802]: I1201 20:15:02.194464 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" event={"ID":"dc4bc05f-4541-4ffe-84b9-b3b54d244094","Type":"ContainerDied","Data":"6175fdd46891bd81af38fd579143a31ab1e715b0466c87fce1a1553120da1c14"} Dec 01 20:15:02 crc kubenswrapper[4802]: I1201 20:15:02.202408 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-jbfz7" Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.490527 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.614311 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9knwv\" (UniqueName: \"kubernetes.io/projected/dc4bc05f-4541-4ffe-84b9-b3b54d244094-kube-api-access-9knwv\") pod \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.614426 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4bc05f-4541-4ffe-84b9-b3b54d244094-secret-volume\") pod \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.614636 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4bc05f-4541-4ffe-84b9-b3b54d244094-config-volume\") pod \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\" (UID: \"dc4bc05f-4541-4ffe-84b9-b3b54d244094\") " Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.615683 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4bc05f-4541-4ffe-84b9-b3b54d244094-config-volume" (OuterVolumeSpecName: "config-volume") pod "dc4bc05f-4541-4ffe-84b9-b3b54d244094" (UID: "dc4bc05f-4541-4ffe-84b9-b3b54d244094"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.621091 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4bc05f-4541-4ffe-84b9-b3b54d244094-kube-api-access-9knwv" (OuterVolumeSpecName: "kube-api-access-9knwv") pod "dc4bc05f-4541-4ffe-84b9-b3b54d244094" (UID: "dc4bc05f-4541-4ffe-84b9-b3b54d244094"). InnerVolumeSpecName "kube-api-access-9knwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.621115 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4bc05f-4541-4ffe-84b9-b3b54d244094-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dc4bc05f-4541-4ffe-84b9-b3b54d244094" (UID: "dc4bc05f-4541-4ffe-84b9-b3b54d244094"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.716909 4802 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4bc05f-4541-4ffe-84b9-b3b54d244094-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.716969 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9knwv\" (UniqueName: \"kubernetes.io/projected/dc4bc05f-4541-4ffe-84b9-b3b54d244094-kube-api-access-9knwv\") on node \"crc\" DevicePath \"\"" Dec 01 20:15:03 crc kubenswrapper[4802]: I1201 20:15:03.716986 4802 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4bc05f-4541-4ffe-84b9-b3b54d244094-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 20:15:04 crc kubenswrapper[4802]: I1201 20:15:04.213432 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" 
event={"ID":"dc4bc05f-4541-4ffe-84b9-b3b54d244094","Type":"ContainerDied","Data":"872ea8c60f04497256635647c7d851c29dd270639b958ee730bd453521e6951b"} Dec 01 20:15:04 crc kubenswrapper[4802]: I1201 20:15:04.213486 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="872ea8c60f04497256635647c7d851c29dd270639b958ee730bd453521e6951b" Dec 01 20:15:04 crc kubenswrapper[4802]: I1201 20:15:04.213546 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7" Dec 01 20:15:06 crc kubenswrapper[4802]: I1201 20:15:06.984972 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4552rv" Dec 01 20:15:10 crc kubenswrapper[4802]: I1201 20:15:10.808748 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-5ftkv" Dec 01 20:15:10 crc kubenswrapper[4802]: I1201 20:15:10.898976 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jcmlp" Dec 01 20:15:10 crc kubenswrapper[4802]: I1201 20:15:10.973981 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-2nm2t" Dec 01 20:15:11 crc kubenswrapper[4802]: I1201 20:15:11.123121 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cswfs" Dec 01 20:15:11 crc kubenswrapper[4802]: I1201 20:15:11.136051 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-p475v" Dec 01 20:15:11 crc kubenswrapper[4802]: I1201 20:15:11.274594 4802 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vb97q" Dec 01 20:15:11 crc kubenswrapper[4802]: I1201 20:15:11.276443 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" event={"ID":"c7839b31-af95-4d33-a954-9615ea0c87a6","Type":"ContainerStarted","Data":"1d49468f05248e4eefd01e03b89fc4d55caf03ae0c2b5c89c42f4192443deea6"} Dec 01 20:15:11 crc kubenswrapper[4802]: I1201 20:15:11.276806 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" Dec 01 20:15:11 crc kubenswrapper[4802]: I1201 20:15:11.328159 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" podStartSLOduration=4.086429422 podStartE2EDuration="1m1.328127522s" podCreationTimestamp="2025-12-01 20:14:10 +0000 UTC" firstStartedPulling="2025-12-01 20:14:13.177423645 +0000 UTC m=+1074.739983286" lastFinishedPulling="2025-12-01 20:15:10.419121735 +0000 UTC m=+1131.981681386" observedRunningTime="2025-12-01 20:15:11.327730979 +0000 UTC m=+1132.890290620" watchObservedRunningTime="2025-12-01 20:15:11.328127522 +0000 UTC m=+1132.890687163" Dec 01 20:15:11 crc kubenswrapper[4802]: I1201 20:15:11.616805 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r89vq" Dec 01 20:15:21 crc kubenswrapper[4802]: I1201 20:15:21.239299 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-djvhl" Dec 01 20:15:28 crc kubenswrapper[4802]: I1201 20:15:28.088372 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:15:28 crc kubenswrapper[4802]: I1201 20:15:28.089021 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:15:28 crc kubenswrapper[4802]: I1201 20:15:28.089079 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:15:28 crc kubenswrapper[4802]: I1201 20:15:28.089962 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdab3d21e9b678aa33ac2623e4e7233c7b5184d7898d89b02a2bb479a6f32dd9"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:15:28 crc kubenswrapper[4802]: I1201 20:15:28.090047 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://cdab3d21e9b678aa33ac2623e4e7233c7b5184d7898d89b02a2bb479a6f32dd9" gracePeriod=600 Dec 01 20:15:29 crc kubenswrapper[4802]: I1201 20:15:29.460303 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="cdab3d21e9b678aa33ac2623e4e7233c7b5184d7898d89b02a2bb479a6f32dd9" exitCode=0 Dec 01 20:15:29 crc kubenswrapper[4802]: I1201 20:15:29.460370 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" 
event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"cdab3d21e9b678aa33ac2623e4e7233c7b5184d7898d89b02a2bb479a6f32dd9"} Dec 01 20:15:29 crc kubenswrapper[4802]: I1201 20:15:29.461505 4802 scope.go:117] "RemoveContainer" containerID="b5dce8f18ad191d77889a5744c4581ae578cf1b86f80207070351f56ae6cf862" Dec 01 20:15:30 crc kubenswrapper[4802]: I1201 20:15:30.474215 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"64791094ee9b4b30a04bff1aa7e941faf215ab40eb68a1d66d92dede04b50331"} Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.769265 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dwwkt"] Dec 01 20:15:37 crc kubenswrapper[4802]: E1201 20:15:37.770364 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4bc05f-4541-4ffe-84b9-b3b54d244094" containerName="collect-profiles" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.770382 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4bc05f-4541-4ffe-84b9-b3b54d244094" containerName="collect-profiles" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.770580 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4bc05f-4541-4ffe-84b9-b3b54d244094" containerName="collect-profiles" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.771483 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.774151 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-9d6n9" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.774438 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.774627 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.775776 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.794501 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dwwkt"] Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.866902 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4hkc5"] Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.868474 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.872779 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.875731 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4hkc5"] Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.882337 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91bd0d2-43a6-4de9-a566-3894686f1da3-config\") pod \"dnsmasq-dns-675f4bcbfc-dwwkt\" (UID: \"f91bd0d2-43a6-4de9-a566-3894686f1da3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.882427 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/f91bd0d2-43a6-4de9-a566-3894686f1da3-kube-api-access-prtbc\") pod \"dnsmasq-dns-675f4bcbfc-dwwkt\" (UID: \"f91bd0d2-43a6-4de9-a566-3894686f1da3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.882504 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gtw9\" (UniqueName: \"kubernetes.io/projected/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-kube-api-access-5gtw9\") pod \"dnsmasq-dns-78dd6ddcc-4hkc5\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.882532 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4hkc5\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.882608 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-config\") pod \"dnsmasq-dns-78dd6ddcc-4hkc5\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.983525 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/f91bd0d2-43a6-4de9-a566-3894686f1da3-kube-api-access-prtbc\") pod \"dnsmasq-dns-675f4bcbfc-dwwkt\" (UID: \"f91bd0d2-43a6-4de9-a566-3894686f1da3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.983602 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gtw9\" (UniqueName: \"kubernetes.io/projected/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-kube-api-access-5gtw9\") pod \"dnsmasq-dns-78dd6ddcc-4hkc5\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.983866 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4hkc5\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.984032 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-config\") pod \"dnsmasq-dns-78dd6ddcc-4hkc5\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:37 
crc kubenswrapper[4802]: I1201 20:15:37.984261 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91bd0d2-43a6-4de9-a566-3894686f1da3-config\") pod \"dnsmasq-dns-675f4bcbfc-dwwkt\" (UID: \"f91bd0d2-43a6-4de9-a566-3894686f1da3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.984944 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91bd0d2-43a6-4de9-a566-3894686f1da3-config\") pod \"dnsmasq-dns-675f4bcbfc-dwwkt\" (UID: \"f91bd0d2-43a6-4de9-a566-3894686f1da3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.984946 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4hkc5\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:37 crc kubenswrapper[4802]: I1201 20:15:37.985027 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-config\") pod \"dnsmasq-dns-78dd6ddcc-4hkc5\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:38 crc kubenswrapper[4802]: I1201 20:15:38.013777 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gtw9\" (UniqueName: \"kubernetes.io/projected/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-kube-api-access-5gtw9\") pod \"dnsmasq-dns-78dd6ddcc-4hkc5\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:38 crc kubenswrapper[4802]: I1201 20:15:38.014760 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/f91bd0d2-43a6-4de9-a566-3894686f1da3-kube-api-access-prtbc\") pod \"dnsmasq-dns-675f4bcbfc-dwwkt\" (UID: \"f91bd0d2-43a6-4de9-a566-3894686f1da3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:15:38 crc kubenswrapper[4802]: I1201 20:15:38.101679 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:15:38 crc kubenswrapper[4802]: I1201 20:15:38.191632 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:15:38 crc kubenswrapper[4802]: I1201 20:15:38.686396 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dwwkt"] Dec 01 20:15:38 crc kubenswrapper[4802]: W1201 20:15:38.770111 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e4f2ce2_fc00_47aa_a0ac_3f7f9aa928c6.slice/crio-4d066aad7eb61bd984e0e2ed157f36c23ab7dc7b733d2db53d4269494ede462d WatchSource:0}: Error finding container 4d066aad7eb61bd984e0e2ed157f36c23ab7dc7b733d2db53d4269494ede462d: Status 404 returned error can't find the container with id 4d066aad7eb61bd984e0e2ed157f36c23ab7dc7b733d2db53d4269494ede462d Dec 01 20:15:38 crc kubenswrapper[4802]: I1201 20:15:38.783135 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4hkc5"] Dec 01 20:15:39 crc kubenswrapper[4802]: I1201 20:15:39.560399 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" event={"ID":"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6","Type":"ContainerStarted","Data":"4d066aad7eb61bd984e0e2ed157f36c23ab7dc7b733d2db53d4269494ede462d"} Dec 01 20:15:39 crc kubenswrapper[4802]: I1201 20:15:39.564786 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" 
event={"ID":"f91bd0d2-43a6-4de9-a566-3894686f1da3","Type":"ContainerStarted","Data":"3db66135b9a0a4f31d8bf2d73347fe436a6cc4f3d7240c0fcb61cbe71878792e"} Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.646269 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dwwkt"] Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.656127 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qgtwk"] Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.662249 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.671792 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qgtwk"] Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.742453 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-qgtwk\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.742712 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfdbs\" (UniqueName: \"kubernetes.io/projected/34c04a7f-117a-421d-8cee-10446cb66fa3-kube-api-access-nfdbs\") pod \"dnsmasq-dns-5ccc8479f9-qgtwk\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.742739 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-config\") pod \"dnsmasq-dns-5ccc8479f9-qgtwk\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.843872 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-qgtwk\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.843945 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfdbs\" (UniqueName: \"kubernetes.io/projected/34c04a7f-117a-421d-8cee-10446cb66fa3-kube-api-access-nfdbs\") pod \"dnsmasq-dns-5ccc8479f9-qgtwk\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.844000 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-config\") pod \"dnsmasq-dns-5ccc8479f9-qgtwk\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.844969 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-qgtwk\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.846009 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-config\") pod \"dnsmasq-dns-5ccc8479f9-qgtwk\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.897970 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfdbs\" (UniqueName: \"kubernetes.io/projected/34c04a7f-117a-421d-8cee-10446cb66fa3-kube-api-access-nfdbs\") pod \"dnsmasq-dns-5ccc8479f9-qgtwk\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.942176 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4hkc5"] Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.980149 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t92k5"] Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.981439 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.990149 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" Dec 01 20:15:40 crc kubenswrapper[4802]: I1201 20:15:40.998782 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t92k5"] Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.148090 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2lt5\" (UniqueName: \"kubernetes.io/projected/bbfc7a1b-086f-4868-817a-61710c05dae0-kube-api-access-d2lt5\") pod \"dnsmasq-dns-57d769cc4f-t92k5\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") " pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.148186 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t92k5\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") " pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" 
Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.148647 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-config\") pod \"dnsmasq-dns-57d769cc4f-t92k5\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") " pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.250018 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2lt5\" (UniqueName: \"kubernetes.io/projected/bbfc7a1b-086f-4868-817a-61710c05dae0-kube-api-access-d2lt5\") pod \"dnsmasq-dns-57d769cc4f-t92k5\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") " pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.250083 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t92k5\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") " pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.250175 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-config\") pod \"dnsmasq-dns-57d769cc4f-t92k5\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") " pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.250978 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-config\") pod \"dnsmasq-dns-57d769cc4f-t92k5\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") " pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.251365 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t92k5\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") " pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.268521 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2lt5\" (UniqueName: \"kubernetes.io/projected/bbfc7a1b-086f-4868-817a-61710c05dae0-kube-api-access-d2lt5\") pod \"dnsmasq-dns-57d769cc4f-t92k5\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") " pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.309495 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.517499 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qgtwk"] Dec 01 20:15:41 crc kubenswrapper[4802]: W1201 20:15:41.521851 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34c04a7f_117a_421d_8cee_10446cb66fa3.slice/crio-ff4f2a359fd96059990646df314186f660f13da3b6e9cd8372b6d9cf1508b216 WatchSource:0}: Error finding container ff4f2a359fd96059990646df314186f660f13da3b6e9cd8372b6d9cf1508b216: Status 404 returned error can't find the container with id ff4f2a359fd96059990646df314186f660f13da3b6e9cd8372b6d9cf1508b216 Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.606371 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" event={"ID":"34c04a7f-117a-421d-8cee-10446cb66fa3","Type":"ContainerStarted","Data":"ff4f2a359fd96059990646df314186f660f13da3b6e9cd8372b6d9cf1508b216"} Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.794333 4802 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.796008 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.798519 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.802424 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.802650 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dfrbx" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.802782 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.802831 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.802898 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.803030 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.807789 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.913358 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t92k5"] Dec 01 20:15:41 crc kubenswrapper[4802]: W1201 20:15:41.919478 4802 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbfc7a1b_086f_4868_817a_61710c05dae0.slice/crio-510fd88f834d7be845e7d9041a83e1ecba11c5293615fb0b5ad18ff83fd6d86b WatchSource:0}: Error finding container 510fd88f834d7be845e7d9041a83e1ecba11c5293615fb0b5ad18ff83fd6d86b: Status 404 returned error can't find the container with id 510fd88f834d7be845e7d9041a83e1ecba11c5293615fb0b5ad18ff83fd6d86b Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971353 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971394 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971429 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971460 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 
01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971477 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptsz2\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-kube-api-access-ptsz2\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971575 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971648 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971707 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971730 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971786 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:41 crc kubenswrapper[4802]: I1201 20:15:41.971831 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.073173 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.073253 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.073691 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: 
I1201 20:15:42.073290 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074092 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074111 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptsz2\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-kube-api-access-ptsz2\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074133 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074148 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074166 4802 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074182 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074218 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074238 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074693 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.074800 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.075432 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.076107 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.077431 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.078688 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.079873 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 
20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.080426 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.095412 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptsz2\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-kube-api-access-ptsz2\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.100445 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.103110 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.105606 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.107170 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.108864 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.108875 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.109272 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.109805 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.109981 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.110170 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.111984 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8cclk" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.125207 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.135643 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277056 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277105 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277126 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tc9c\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-kube-api-access-9tc9c\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277155 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/563ae8fc-e33c-402e-8901-79434cf68179-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277180 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " 
pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277242 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277290 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/563ae8fc-e33c-402e-8901-79434cf68179-pod-info\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277313 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277330 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.277353 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-server-conf\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 
20:15:42.277382 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-config-data\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378333 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378364 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378396 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-server-conf\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378426 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-config-data\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378452 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378472 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378488 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tc9c\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-kube-api-access-9tc9c\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378515 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/563ae8fc-e33c-402e-8901-79434cf68179-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378539 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378567 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.378638 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/563ae8fc-e33c-402e-8901-79434cf68179-pod-info\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.379269 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.379798 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-config-data\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.380918 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-server-conf\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.384242 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.384717 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.384812 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.391400 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/563ae8fc-e33c-402e-8901-79434cf68179-pod-info\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.392551 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.392847 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.393927 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/563ae8fc-e33c-402e-8901-79434cf68179-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.424943 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tc9c\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-kube-api-access-9tc9c\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.443928 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.483230 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.592773 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 20:15:42 crc kubenswrapper[4802]: W1201 20:15:42.624624 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e35ed2_d0e5_4f29_8869_3740e22f5cd9.slice/crio-a3e44ee4e6433e4d54b3f5b625cf43c653949c17727eafe548e6cc7dae4cfa9c WatchSource:0}: Error finding container a3e44ee4e6433e4d54b3f5b625cf43c653949c17727eafe548e6cc7dae4cfa9c: Status 404 returned error can't find the container with id a3e44ee4e6433e4d54b3f5b625cf43c653949c17727eafe548e6cc7dae4cfa9c Dec 01 20:15:42 crc kubenswrapper[4802]: I1201 20:15:42.631005 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" event={"ID":"bbfc7a1b-086f-4868-817a-61710c05dae0","Type":"ContainerStarted","Data":"510fd88f834d7be845e7d9041a83e1ecba11c5293615fb0b5ad18ff83fd6d86b"} Dec 01 20:15:43 crc 
kubenswrapper[4802]: I1201 20:15:43.317000 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.669529 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"563ae8fc-e33c-402e-8901-79434cf68179","Type":"ContainerStarted","Data":"85ef24648f2d17c68856a4994116a5efe34c0eb167586eb0766c35be5d059735"} Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.671549 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"97e35ed2-d0e5-4f29-8869-3740e22f5cd9","Type":"ContainerStarted","Data":"a3e44ee4e6433e4d54b3f5b625cf43c653949c17727eafe548e6cc7dae4cfa9c"} Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.697914 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.699515 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.706521 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.707574 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-p5lzl" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.709506 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.715023 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.717295 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.841327 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.865936 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f360e58-7047-4369-a8c8-4e0394586f62-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.866011 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f360e58-7047-4369-a8c8-4e0394586f62-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.866060 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f360e58-7047-4369-a8c8-4e0394586f62-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.866134 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhfg\" (UniqueName: \"kubernetes.io/projected/0f360e58-7047-4369-a8c8-4e0394586f62-kube-api-access-2dhfg\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.866241 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f360e58-7047-4369-a8c8-4e0394586f62-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.866269 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f360e58-7047-4369-a8c8-4e0394586f62-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.866297 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f360e58-7047-4369-a8c8-4e0394586f62-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.866330 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.977861 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhfg\" (UniqueName: \"kubernetes.io/projected/0f360e58-7047-4369-a8c8-4e0394586f62-kube-api-access-2dhfg\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.978707 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f360e58-7047-4369-a8c8-4e0394586f62-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.978762 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f360e58-7047-4369-a8c8-4e0394586f62-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.978792 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f360e58-7047-4369-a8c8-4e0394586f62-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.978832 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.978955 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f360e58-7047-4369-a8c8-4e0394586f62-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.978984 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f360e58-7047-4369-a8c8-4e0394586f62-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.979038 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f360e58-7047-4369-a8c8-4e0394586f62-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.980325 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0f360e58-7047-4369-a8c8-4e0394586f62-kolla-config\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.984978 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") device mount path \"/mnt/openstack/pv04\"" 
pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.986823 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0f360e58-7047-4369-a8c8-4e0394586f62-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:43 crc kubenswrapper[4802]: I1201 20:15:43.987715 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0f360e58-7047-4369-a8c8-4e0394586f62-config-data-default\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:44 crc kubenswrapper[4802]: I1201 20:15:44.000716 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f360e58-7047-4369-a8c8-4e0394586f62-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:44 crc kubenswrapper[4802]: I1201 20:15:44.001333 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f360e58-7047-4369-a8c8-4e0394586f62-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:44 crc kubenswrapper[4802]: I1201 20:15:44.014891 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhfg\" (UniqueName: \"kubernetes.io/projected/0f360e58-7047-4369-a8c8-4e0394586f62-kube-api-access-2dhfg\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:44 crc kubenswrapper[4802]: I1201 20:15:44.020717 4802 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f360e58-7047-4369-a8c8-4e0394586f62-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:44 crc kubenswrapper[4802]: I1201 20:15:44.064186 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0f360e58-7047-4369-a8c8-4e0394586f62\") " pod="openstack/openstack-galera-0" Dec 01 20:15:44 crc kubenswrapper[4802]: I1201 20:15:44.221275 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.028604 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.030160 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.032314 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-jzvcq" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.033337 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.033634 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.033823 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.048325 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.106283 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/00254b08-a75a-4965-8b19-f4bc8ebf6f52-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.106340 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.106411 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/00254b08-a75a-4965-8b19-f4bc8ebf6f52-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.106434 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzknm\" (UniqueName: \"kubernetes.io/projected/00254b08-a75a-4965-8b19-f4bc8ebf6f52-kube-api-access-jzknm\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.106451 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00254b08-a75a-4965-8b19-f4bc8ebf6f52-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.106470 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00254b08-a75a-4965-8b19-f4bc8ebf6f52-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.106497 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/00254b08-a75a-4965-8b19-f4bc8ebf6f52-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.106521 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/00254b08-a75a-4965-8b19-f4bc8ebf6f52-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.214904 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/00254b08-a75a-4965-8b19-f4bc8ebf6f52-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.215291 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzknm\" (UniqueName: \"kubernetes.io/projected/00254b08-a75a-4965-8b19-f4bc8ebf6f52-kube-api-access-jzknm\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.215319 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00254b08-a75a-4965-8b19-f4bc8ebf6f52-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.215349 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00254b08-a75a-4965-8b19-f4bc8ebf6f52-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.215393 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/00254b08-a75a-4965-8b19-f4bc8ebf6f52-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.215432 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/00254b08-a75a-4965-8b19-f4bc8ebf6f52-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.215467 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/00254b08-a75a-4965-8b19-f4bc8ebf6f52-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.215500 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.215727 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.216668 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/00254b08-a75a-4965-8b19-f4bc8ebf6f52-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.217662 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/00254b08-a75a-4965-8b19-f4bc8ebf6f52-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.218899 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/00254b08-a75a-4965-8b19-f4bc8ebf6f52-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.221140 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00254b08-a75a-4965-8b19-f4bc8ebf6f52-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.221317 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.222745 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.230058 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.230575 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-q2rqr" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.230766 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.237324 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/00254b08-a75a-4965-8b19-f4bc8ebf6f52-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.246275 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00254b08-a75a-4965-8b19-f4bc8ebf6f52-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.247467 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzknm\" (UniqueName: \"kubernetes.io/projected/00254b08-a75a-4965-8b19-f4bc8ebf6f52-kube-api-access-jzknm\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.263621 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.279356 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"00254b08-a75a-4965-8b19-f4bc8ebf6f52\") " pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.319315 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjdxf\" (UniqueName: \"kubernetes.io/projected/06e3c630-6e2f-4fde-96ac-feea509e3dcb-kube-api-access-wjdxf\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.319763 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e3c630-6e2f-4fde-96ac-feea509e3dcb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.320020 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e3c630-6e2f-4fde-96ac-feea509e3dcb-kolla-config\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.320167 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e3c630-6e2f-4fde-96ac-feea509e3dcb-config-data\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.320457 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/06e3c630-6e2f-4fde-96ac-feea509e3dcb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.378414 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.422598 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjdxf\" (UniqueName: \"kubernetes.io/projected/06e3c630-6e2f-4fde-96ac-feea509e3dcb-kube-api-access-wjdxf\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.422651 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e3c630-6e2f-4fde-96ac-feea509e3dcb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.422700 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e3c630-6e2f-4fde-96ac-feea509e3dcb-kolla-config\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.422724 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e3c630-6e2f-4fde-96ac-feea509e3dcb-config-data\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.423115 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/06e3c630-6e2f-4fde-96ac-feea509e3dcb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.423659 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06e3c630-6e2f-4fde-96ac-feea509e3dcb-config-data\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.424260 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/06e3c630-6e2f-4fde-96ac-feea509e3dcb-kolla-config\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.428242 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e3c630-6e2f-4fde-96ac-feea509e3dcb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.451115 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjdxf\" (UniqueName: \"kubernetes.io/projected/06e3c630-6e2f-4fde-96ac-feea509e3dcb-kube-api-access-wjdxf\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.475403 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e3c630-6e2f-4fde-96ac-feea509e3dcb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"06e3c630-6e2f-4fde-96ac-feea509e3dcb\") " pod="openstack/memcached-0" Dec 01 20:15:45 crc kubenswrapper[4802]: I1201 20:15:45.626071 4802 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 20:15:47 crc kubenswrapper[4802]: I1201 20:15:47.155698 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 20:15:47 crc kubenswrapper[4802]: I1201 20:15:47.157071 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 20:15:47 crc kubenswrapper[4802]: I1201 20:15:47.160549 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xgwfg" Dec 01 20:15:47 crc kubenswrapper[4802]: I1201 20:15:47.171710 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 20:15:47 crc kubenswrapper[4802]: I1201 20:15:47.258449 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hdx6\" (UniqueName: \"kubernetes.io/projected/a6a6f4e1-7593-427b-b430-42c7b351e652-kube-api-access-2hdx6\") pod \"kube-state-metrics-0\" (UID: \"a6a6f4e1-7593-427b-b430-42c7b351e652\") " pod="openstack/kube-state-metrics-0" Dec 01 20:15:47 crc kubenswrapper[4802]: I1201 20:15:47.360839 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hdx6\" (UniqueName: \"kubernetes.io/projected/a6a6f4e1-7593-427b-b430-42c7b351e652-kube-api-access-2hdx6\") pod \"kube-state-metrics-0\" (UID: \"a6a6f4e1-7593-427b-b430-42c7b351e652\") " pod="openstack/kube-state-metrics-0" Dec 01 20:15:47 crc kubenswrapper[4802]: I1201 20:15:47.414614 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hdx6\" (UniqueName: \"kubernetes.io/projected/a6a6f4e1-7593-427b-b430-42c7b351e652-kube-api-access-2hdx6\") pod \"kube-state-metrics-0\" (UID: \"a6a6f4e1-7593-427b-b430-42c7b351e652\") " pod="openstack/kube-state-metrics-0" Dec 01 20:15:47 crc kubenswrapper[4802]: I1201 20:15:47.490754 
4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.112922 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gczlr"] Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.143247 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-j9thx"] Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.143385 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.145779 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.146076 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.147528 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-c5bdm" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.151456 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gczlr"] Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.151577 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.161569 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j9thx"] Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.209756 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-var-log\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.209805 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-combined-ca-bundle\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.209842 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-ovn-controller-tls-certs\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.209910 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-var-lib\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.209933 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-var-run\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.209950 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74q9\" (UniqueName: \"kubernetes.io/projected/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-kube-api-access-w74q9\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.210138 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb8tl\" (UniqueName: \"kubernetes.io/projected/15107e62-7679-460c-ab0e-f208b4a1ec76-kube-api-access-jb8tl\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.210249 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-var-run\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.210278 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-var-log-ovn\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.210348 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-etc-ovs\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.210376 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-scripts\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.210400 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15107e62-7679-460c-ab0e-f208b4a1ec76-scripts\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.210652 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-var-run-ovn\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311114 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-ovn-controller-tls-certs\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311214 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-var-lib\") pod 
\"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311249 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-var-run\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311268 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w74q9\" (UniqueName: \"kubernetes.io/projected/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-kube-api-access-w74q9\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311306 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb8tl\" (UniqueName: \"kubernetes.io/projected/15107e62-7679-460c-ab0e-f208b4a1ec76-kube-api-access-jb8tl\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311334 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-var-run\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311353 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-var-log-ovn\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 
01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311383 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-etc-ovs\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311406 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-scripts\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.312031 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15107e62-7679-460c-ab0e-f208b4a1ec76-scripts\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.312067 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-etc-ovs\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311856 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-var-run\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.312164 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-var-lib\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.312218 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-var-run-ovn\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.311895 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-var-run\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.312566 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-var-run-ovn\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.312583 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-var-log\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.312680 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-combined-ca-bundle\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " 
pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.312697 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/15107e62-7679-460c-ab0e-f208b4a1ec76-var-log\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.312770 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-var-log-ovn\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.314016 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-scripts\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.316033 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15107e62-7679-460c-ab0e-f208b4a1ec76-scripts\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.325114 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-ovn-controller-tls-certs\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.325178 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-combined-ca-bundle\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.330215 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb8tl\" (UniqueName: \"kubernetes.io/projected/15107e62-7679-460c-ab0e-f208b4a1ec76-kube-api-access-jb8tl\") pod \"ovn-controller-ovs-j9thx\" (UID: \"15107e62-7679-460c-ab0e-f208b4a1ec76\") " pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.338843 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74q9\" (UniqueName: \"kubernetes.io/projected/4bbe4b6e-302e-4d6d-bc17-5f35baca1067-kube-api-access-w74q9\") pod \"ovn-controller-gczlr\" (UID: \"4bbe4b6e-302e-4d6d-bc17-5f35baca1067\") " pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.472370 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gczlr" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.481610 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.712568 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.714165 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.716529 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.717158 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-n6tjk" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.718407 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.718501 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.718683 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.725615 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.819545 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.819621 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.819649 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.819815 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-config\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.820029 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.820099 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.820141 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqv4n\" (UniqueName: \"kubernetes.io/projected/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-kube-api-access-gqv4n\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.820167 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.921366 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.921429 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-config\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.921484 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.921513 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.921542 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqv4n\" (UniqueName: \"kubernetes.io/projected/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-kube-api-access-gqv4n\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 
20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.921567 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.921597 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.922433 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.922607 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.925864 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-config\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.926609 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.927487 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.927579 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.929991 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.934013 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.941322 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqv4n\" (UniqueName: \"kubernetes.io/projected/fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c-kube-api-access-gqv4n\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " 
pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:51 crc kubenswrapper[4802]: I1201 20:15:51.949685 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c\") " pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:52 crc kubenswrapper[4802]: I1201 20:15:52.044834 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.623469 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.627744 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.630442 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-hlz4k" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.632378 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.632569 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.633986 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.639963 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.776175 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8jnq\" (UniqueName: 
\"kubernetes.io/projected/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-kube-api-access-m8jnq\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.776721 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.777011 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-config\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.777267 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.777492 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.777898 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.778082 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.778334 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.880789 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.881487 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.881558 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.881607 
4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.881711 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8jnq\" (UniqueName: \"kubernetes.io/projected/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-kube-api-access-m8jnq\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.881785 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.881963 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-config\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.882042 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.882405 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.882701 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.883302 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-config\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.884138 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.889477 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.889645 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 
crc kubenswrapper[4802]: I1201 20:15:54.890285 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.903783 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.912514 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8jnq\" (UniqueName: \"kubernetes.io/projected/e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9-kube-api-access-m8jnq\") pod \"ovsdbserver-sb-0\" (UID: \"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9\") " pod="openstack/ovsdbserver-sb-0" Dec 01 20:15:54 crc kubenswrapper[4802]: I1201 20:15:54.955581 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 20:16:05 crc kubenswrapper[4802]: E1201 20:16:05.237262 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 01 20:16:05 crc kubenswrapper[4802]: E1201 20:16:05.238000 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9tc9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(563ae8fc-e33c-402e-8901-79434cf68179): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:16:05 crc 
kubenswrapper[4802]: E1201 20:16:05.239217 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="563ae8fc-e33c-402e-8901-79434cf68179" Dec 01 20:16:05 crc kubenswrapper[4802]: E1201 20:16:05.261399 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 01 20:16:05 crc kubenswrapper[4802]: E1201 20:16:05.261654 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptsz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(97e35ed2-d0e5-4f29-8869-3740e22f5cd9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:16:05 crc 
kubenswrapper[4802]: E1201 20:16:05.262891 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" Dec 01 20:16:06 crc kubenswrapper[4802]: E1201 20:16:06.125868 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="563ae8fc-e33c-402e-8901-79434cf68179" Dec 01 20:16:06 crc kubenswrapper[4802]: E1201 20:16:06.130286 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.523901 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.525152 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nfdbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-qgtwk_openstack(34c04a7f-117a-421d-8cee-10446cb66fa3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.526378 4802 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" podUID="34c04a7f-117a-421d-8cee-10446cb66fa3" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.527608 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.527773 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prtbc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-dwwkt_openstack(f91bd0d2-43a6-4de9-a566-3894686f1da3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.529759 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" podUID="f91bd0d2-43a6-4de9-a566-3894686f1da3" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.617466 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.617716 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d2lt5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPoli
cy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-t92k5_openstack(bbfc7a1b-086f-4868-817a-61710c05dae0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.618870 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" podUID="bbfc7a1b-086f-4868-817a-61710c05dae0" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.658668 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.658839 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5gtw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4hkc5_openstack(7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:16:11 crc kubenswrapper[4802]: E1201 20:16:11.660267 4802 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" podUID="7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6" Dec 01 20:16:12 crc kubenswrapper[4802]: E1201 20:16:12.198322 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" podUID="34c04a7f-117a-421d-8cee-10446cb66fa3" Dec 01 20:16:12 crc kubenswrapper[4802]: E1201 20:16:12.199989 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" podUID="bbfc7a1b-086f-4868-817a-61710c05dae0" Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.240546 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.347828 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.472884 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gczlr"] Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.520183 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.527060 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.608670 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 20:16:12 crc 
kubenswrapper[4802]: W1201 20:16:12.611816 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe3f7038_8eda_4ace_8ab0_26e6d4e5db4c.slice/crio-5853c2bdb63adff40d1a5af66530941f3ea09e172387d5a8fd78a1a819766c4f WatchSource:0}: Error finding container 5853c2bdb63adff40d1a5af66530941f3ea09e172387d5a8fd78a1a819766c4f: Status 404 returned error can't find the container with id 5853c2bdb63adff40d1a5af66530941f3ea09e172387d5a8fd78a1a819766c4f Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.791001 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j9thx"] Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.815052 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.911249 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.912671 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91bd0d2-43a6-4de9-a566-3894686f1da3-config\") pod \"f91bd0d2-43a6-4de9-a566-3894686f1da3\" (UID: \"f91bd0d2-43a6-4de9-a566-3894686f1da3\") " Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.912792 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/f91bd0d2-43a6-4de9-a566-3894686f1da3-kube-api-access-prtbc\") pod \"f91bd0d2-43a6-4de9-a566-3894686f1da3\" (UID: \"f91bd0d2-43a6-4de9-a566-3894686f1da3\") " Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.913459 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91bd0d2-43a6-4de9-a566-3894686f1da3-config" (OuterVolumeSpecName: 
"config") pod "f91bd0d2-43a6-4de9-a566-3894686f1da3" (UID: "f91bd0d2-43a6-4de9-a566-3894686f1da3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:16:12 crc kubenswrapper[4802]: I1201 20:16:12.918789 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91bd0d2-43a6-4de9-a566-3894686f1da3-kube-api-access-prtbc" (OuterVolumeSpecName: "kube-api-access-prtbc") pod "f91bd0d2-43a6-4de9-a566-3894686f1da3" (UID: "f91bd0d2-43a6-4de9-a566-3894686f1da3"). InnerVolumeSpecName "kube-api-access-prtbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.014118 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gtw9\" (UniqueName: \"kubernetes.io/projected/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-kube-api-access-5gtw9\") pod \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.014296 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-config\") pod \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.014342 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-dns-svc\") pod \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\" (UID: \"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6\") " Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.014762 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prtbc\" (UniqueName: \"kubernetes.io/projected/f91bd0d2-43a6-4de9-a566-3894686f1da3-kube-api-access-prtbc\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:13 crc 
kubenswrapper[4802]: I1201 20:16:13.014786 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f91bd0d2-43a6-4de9-a566-3894686f1da3-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.015155 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6" (UID: "7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.015335 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-config" (OuterVolumeSpecName: "config") pod "7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6" (UID: "7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.017127 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-kube-api-access-5gtw9" (OuterVolumeSpecName: "kube-api-access-5gtw9") pod "7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6" (UID: "7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6"). InnerVolumeSpecName "kube-api-access-5gtw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.116761 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.117159 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gtw9\" (UniqueName: \"kubernetes.io/projected/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-kube-api-access-5gtw9\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.117178 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.145655 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.208268 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c","Type":"ContainerStarted","Data":"5853c2bdb63adff40d1a5af66530941f3ea09e172387d5a8fd78a1a819766c4f"} Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.209632 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6a6f4e1-7593-427b-b430-42c7b351e652","Type":"ContainerStarted","Data":"b08d28cf818079fcd286d9cd40867d647ea9b59711c606f1c8313dbc5d75ea73"} Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.213043 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gczlr" event={"ID":"4bbe4b6e-302e-4d6d-bc17-5f35baca1067","Type":"ContainerStarted","Data":"0d8b484085bbc1f75bf5c0443be9657650be7df0d955d425cd3ba9ab517c514a"} Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.214822 4802 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"06e3c630-6e2f-4fde-96ac-feea509e3dcb","Type":"ContainerStarted","Data":"648b882706b0f37184d23ed13c76106946dfaac34a313a42df0efa3a3369b394"} Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.217474 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j9thx" event={"ID":"15107e62-7679-460c-ab0e-f208b4a1ec76","Type":"ContainerStarted","Data":"561b9029b169296b73d2dde04038ae5b0b42d3161f2fb04a94f90908f3f9a638"} Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.219412 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.219440 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dwwkt" event={"ID":"f91bd0d2-43a6-4de9-a566-3894686f1da3","Type":"ContainerDied","Data":"3db66135b9a0a4f31d8bf2d73347fe436a6cc4f3d7240c0fcb61cbe71878792e"} Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.222517 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"00254b08-a75a-4965-8b19-f4bc8ebf6f52","Type":"ContainerStarted","Data":"8b3dfac31eed04d0e7fa0c6d2e4dc6fdc3e086bb288c0ca8ebdf47c8432ad5f2"} Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.230038 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f360e58-7047-4369-a8c8-4e0394586f62","Type":"ContainerStarted","Data":"cccac5e8ee0f5b7a9f0a84d3ab1e4588079a03513d9c2346ba3e895ca0f4bb5f"} Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.248859 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" event={"ID":"7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6","Type":"ContainerDied","Data":"4d066aad7eb61bd984e0e2ed157f36c23ab7dc7b733d2db53d4269494ede462d"} Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 
20:16:13.249098 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4hkc5" Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.313648 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dwwkt"] Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.329385 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dwwkt"] Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.344704 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4hkc5"] Dec 01 20:16:13 crc kubenswrapper[4802]: I1201 20:16:13.351963 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4hkc5"] Dec 01 20:16:13 crc kubenswrapper[4802]: W1201 20:16:13.450575 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode808d11c_2f7d_44fd_a3f5_e30d3c3c9cb9.slice/crio-8071f48ec7615fa891127c20361c6dd84771119e0aff790e1bc7a8aa9866188e WatchSource:0}: Error finding container 8071f48ec7615fa891127c20361c6dd84771119e0aff790e1bc7a8aa9866188e: Status 404 returned error can't find the container with id 8071f48ec7615fa891127c20361c6dd84771119e0aff790e1bc7a8aa9866188e Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.262102 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9","Type":"ContainerStarted","Data":"8071f48ec7615fa891127c20361c6dd84771119e0aff790e1bc7a8aa9866188e"} Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.562002 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6p8lx"] Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.563820 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.572404 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.580342 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6p8lx"] Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.758126 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8140ed-9737-48ea-a0ea-15003dd90986-combined-ca-bundle\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.758235 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dd8140ed-9737-48ea-a0ea-15003dd90986-ovs-rundir\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.758432 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrkp5\" (UniqueName: \"kubernetes.io/projected/dd8140ed-9737-48ea-a0ea-15003dd90986-kube-api-access-wrkp5\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.758548 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd8140ed-9737-48ea-a0ea-15003dd90986-config\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " 
pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.758772 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dd8140ed-9737-48ea-a0ea-15003dd90986-ovn-rundir\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.758896 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8140ed-9737-48ea-a0ea-15003dd90986-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.761772 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6" path="/var/lib/kubelet/pods/7e4f2ce2-fc00-47aa-a0ac-3f7f9aa928c6/volumes" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.762293 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91bd0d2-43a6-4de9-a566-3894686f1da3" path="/var/lib/kubelet/pods/f91bd0d2-43a6-4de9-a566-3894686f1da3/volumes" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.762691 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t92k5"] Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.762739 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6hncq"] Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.764107 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.770314 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.775797 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6hncq"] Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.861032 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8140ed-9737-48ea-a0ea-15003dd90986-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.861098 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8140ed-9737-48ea-a0ea-15003dd90986-combined-ca-bundle\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.861137 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.861175 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dd8140ed-9737-48ea-a0ea-15003dd90986-ovs-rundir\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 
crc kubenswrapper[4802]: I1201 20:16:14.861218 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrkp5\" (UniqueName: \"kubernetes.io/projected/dd8140ed-9737-48ea-a0ea-15003dd90986-kube-api-access-wrkp5\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.861246 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd8140ed-9737-48ea-a0ea-15003dd90986-config\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.861276 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dskx9\" (UniqueName: \"kubernetes.io/projected/5f3560a7-a7f9-4497-844f-360796f6ece8-kube-api-access-dskx9\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.861308 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.861324 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-config\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc 
kubenswrapper[4802]: I1201 20:16:14.861347 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dd8140ed-9737-48ea-a0ea-15003dd90986-ovn-rundir\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.861765 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dd8140ed-9737-48ea-a0ea-15003dd90986-ovn-rundir\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.862451 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd8140ed-9737-48ea-a0ea-15003dd90986-config\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.862515 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dd8140ed-9737-48ea-a0ea-15003dd90986-ovs-rundir\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.873288 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8140ed-9737-48ea-a0ea-15003dd90986-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.873433 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8140ed-9737-48ea-a0ea-15003dd90986-combined-ca-bundle\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.893241 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrkp5\" (UniqueName: \"kubernetes.io/projected/dd8140ed-9737-48ea-a0ea-15003dd90986-kube-api-access-wrkp5\") pod \"ovn-controller-metrics-6p8lx\" (UID: \"dd8140ed-9737-48ea-a0ea-15003dd90986\") " pod="openstack/ovn-controller-metrics-6p8lx" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.961441 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qgtwk"] Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.963322 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-config\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.963417 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.963503 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dskx9\" (UniqueName: \"kubernetes.io/projected/5f3560a7-a7f9-4497-844f-360796f6ece8-kube-api-access-dskx9\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 
crc kubenswrapper[4802]: I1201 20:16:14.963544 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.964430 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.965746 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-config\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:14 crc kubenswrapper[4802]: I1201 20:16:14.974516 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.006276 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dskx9\" (UniqueName: \"kubernetes.io/projected/5f3560a7-a7f9-4497-844f-360796f6ece8-kube-api-access-dskx9\") pod \"dnsmasq-dns-7fd796d7df-6hncq\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") " pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.017694 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-86db49b7ff-kw24b"] Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.019141 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.024692 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.026788 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kw24b"] Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.096532 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.167310 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2fzr\" (UniqueName: \"kubernetes.io/projected/415afd0d-1968-4f65-b7ed-6d4acbde81c9-kube-api-access-z2fzr\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.168607 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.168718 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-config\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" Dec 01 20:16:15 crc 
kubenswrapper[4802]: I1201 20:16:15.168778 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.168846 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.191004 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6p8lx"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.270626 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.270730 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-config\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.270772 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.270810 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.270875 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2fzr\" (UniqueName: \"kubernetes.io/projected/415afd0d-1968-4f65-b7ed-6d4acbde81c9-kube-api-access-z2fzr\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.271896 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.271898 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.272249 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.273362 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-config\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.294007 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2fzr\" (UniqueName: \"kubernetes.io/projected/415afd0d-1968-4f65-b7ed-6d4acbde81c9-kube-api-access-z2fzr\") pod \"dnsmasq-dns-86db49b7ff-kw24b\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.371256 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.802162 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t92k5"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.811032 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk"
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.881060 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-config\") pod \"34c04a7f-117a-421d-8cee-10446cb66fa3\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") "
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.881142 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-config\") pod \"bbfc7a1b-086f-4868-817a-61710c05dae0\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") "
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.881347 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-dns-svc\") pod \"bbfc7a1b-086f-4868-817a-61710c05dae0\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") "
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.881379 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-dns-svc\") pod \"34c04a7f-117a-421d-8cee-10446cb66fa3\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") "
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.881469 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfdbs\" (UniqueName: \"kubernetes.io/projected/34c04a7f-117a-421d-8cee-10446cb66fa3-kube-api-access-nfdbs\") pod \"34c04a7f-117a-421d-8cee-10446cb66fa3\" (UID: \"34c04a7f-117a-421d-8cee-10446cb66fa3\") "
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.881505 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2lt5\" (UniqueName: \"kubernetes.io/projected/bbfc7a1b-086f-4868-817a-61710c05dae0-kube-api-access-d2lt5\") pod \"bbfc7a1b-086f-4868-817a-61710c05dae0\" (UID: \"bbfc7a1b-086f-4868-817a-61710c05dae0\") "
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.881540 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-config" (OuterVolumeSpecName: "config") pod "34c04a7f-117a-421d-8cee-10446cb66fa3" (UID: "34c04a7f-117a-421d-8cee-10446cb66fa3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.882004 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbfc7a1b-086f-4868-817a-61710c05dae0" (UID: "bbfc7a1b-086f-4868-817a-61710c05dae0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.882028 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-config\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.882228 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34c04a7f-117a-421d-8cee-10446cb66fa3" (UID: "34c04a7f-117a-421d-8cee-10446cb66fa3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.882305 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-config" (OuterVolumeSpecName: "config") pod "bbfc7a1b-086f-4868-817a-61710c05dae0" (UID: "bbfc7a1b-086f-4868-817a-61710c05dae0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.886383 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c04a7f-117a-421d-8cee-10446cb66fa3-kube-api-access-nfdbs" (OuterVolumeSpecName: "kube-api-access-nfdbs") pod "34c04a7f-117a-421d-8cee-10446cb66fa3" (UID: "34c04a7f-117a-421d-8cee-10446cb66fa3"). InnerVolumeSpecName "kube-api-access-nfdbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.889306 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbfc7a1b-086f-4868-817a-61710c05dae0-kube-api-access-d2lt5" (OuterVolumeSpecName: "kube-api-access-d2lt5") pod "bbfc7a1b-086f-4868-817a-61710c05dae0" (UID: "bbfc7a1b-086f-4868-817a-61710c05dae0"). InnerVolumeSpecName "kube-api-access-d2lt5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.983329 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfdbs\" (UniqueName: \"kubernetes.io/projected/34c04a7f-117a-421d-8cee-10446cb66fa3-kube-api-access-nfdbs\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.983370 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2lt5\" (UniqueName: \"kubernetes.io/projected/bbfc7a1b-086f-4868-817a-61710c05dae0-kube-api-access-d2lt5\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.983386 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-config\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.983400 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbfc7a1b-086f-4868-817a-61710c05dae0-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:15 crc kubenswrapper[4802]: I1201 20:16:15.983413 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34c04a7f-117a-421d-8cee-10446cb66fa3-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.285436 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk" event={"ID":"34c04a7f-117a-421d-8cee-10446cb66fa3","Type":"ContainerDied","Data":"ff4f2a359fd96059990646df314186f660f13da3b6e9cd8372b6d9cf1508b216"}
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.286139 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-qgtwk"
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.288248 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t92k5" event={"ID":"bbfc7a1b-086f-4868-817a-61710c05dae0","Type":"ContainerDied","Data":"510fd88f834d7be845e7d9041a83e1ecba11c5293615fb0b5ad18ff83fd6d86b"}
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.288365 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t92k5"
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.354000 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qgtwk"]
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.366644 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-qgtwk"]
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.383154 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t92k5"]
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.390433 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t92k5"]
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.733696 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c04a7f-117a-421d-8cee-10446cb66fa3" path="/var/lib/kubelet/pods/34c04a7f-117a-421d-8cee-10446cb66fa3/volumes"
Dec 01 20:16:16 crc kubenswrapper[4802]: I1201 20:16:16.734343 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbfc7a1b-086f-4868-817a-61710c05dae0" path="/var/lib/kubelet/pods/bbfc7a1b-086f-4868-817a-61710c05dae0/volumes"
Dec 01 20:16:22 crc kubenswrapper[4802]: I1201 20:16:22.785048 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6hncq"]
Dec 01 20:16:22 crc kubenswrapper[4802]: I1201 20:16:22.849890 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kw24b"]
Dec 01 20:16:22 crc kubenswrapper[4802]: I1201 20:16:22.914571 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6p8lx"]
Dec 01 20:16:22 crc kubenswrapper[4802]: W1201 20:16:22.937274 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd8140ed_9737_48ea_a0ea_15003dd90986.slice/crio-ee0bdeac70fc99a52b0cea781ed741af106a0cdbc4eba8b62fb7205a8a1e6e42 WatchSource:0}: Error finding container ee0bdeac70fc99a52b0cea781ed741af106a0cdbc4eba8b62fb7205a8a1e6e42: Status 404 returned error can't find the container with id ee0bdeac70fc99a52b0cea781ed741af106a0cdbc4eba8b62fb7205a8a1e6e42
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.339714 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f360e58-7047-4369-a8c8-4e0394586f62","Type":"ContainerStarted","Data":"7ffd4c0a748c136958a0655dd6dcd1aa6d2a9d6b5ba16ad3c949dd92cc8d0a2c"}
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.341488 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" event={"ID":"5f3560a7-a7f9-4497-844f-360796f6ece8","Type":"ContainerStarted","Data":"a8ca37e3a4e3eea09819804a8ab0bf1934ae4f63c7475e5e5da1284115cc64c9"}
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.343432 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j9thx" event={"ID":"15107e62-7679-460c-ab0e-f208b4a1ec76","Type":"ContainerStarted","Data":"79d87d739da4128d9d3df02e39172948caf59759e01dfcb300a952c5a149c417"}
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.345882 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c","Type":"ContainerStarted","Data":"fa7654236e2c72a6be87a47f12957e0049f367e9b075c4cfff358d9a489dc4ac"}
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.348011 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"00254b08-a75a-4965-8b19-f4bc8ebf6f52","Type":"ContainerStarted","Data":"fe97cc493715ea510d20838f2b468a1464f352dddb2ed049121628ea9985dca6"}
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.350286 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" event={"ID":"415afd0d-1968-4f65-b7ed-6d4acbde81c9","Type":"ContainerStarted","Data":"c1190ca03c369d4bfe2f2e9605bea238c993267b04bfa0768a34b1505f1fc8b0"}
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.351444 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6p8lx" event={"ID":"dd8140ed-9737-48ea-a0ea-15003dd90986","Type":"ContainerStarted","Data":"ee0bdeac70fc99a52b0cea781ed741af106a0cdbc4eba8b62fb7205a8a1e6e42"}
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.353308 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"06e3c630-6e2f-4fde-96ac-feea509e3dcb","Type":"ContainerStarted","Data":"fe7122d32416f548f56af333361dd277f18a06b4af68c45189a085d894b385af"}
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.353667 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Dec 01 20:16:23 crc kubenswrapper[4802]: I1201 20:16:23.441386 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=29.286585064 podStartE2EDuration="38.441364155s" podCreationTimestamp="2025-12-01 20:15:45 +0000 UTC" firstStartedPulling="2025-12-01 20:16:12.551441439 +0000 UTC m=+1194.114001080" lastFinishedPulling="2025-12-01 20:16:21.70622053 +0000 UTC m=+1203.268780171" observedRunningTime="2025-12-01 20:16:23.424628289 +0000 UTC m=+1204.987187930" watchObservedRunningTime="2025-12-01 20:16:23.441364155 +0000 UTC m=+1205.003923796"
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.411296 4802 generic.go:334] "Generic (PLEG): container finished" podID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerID="f7c0d46599dfdc92d7ba66dfcaa0ecd5c465ab93ef2bbce2390a7d6f4124b832" exitCode=0
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.412324 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" event={"ID":"415afd0d-1968-4f65-b7ed-6d4acbde81c9","Type":"ContainerDied","Data":"f7c0d46599dfdc92d7ba66dfcaa0ecd5c465ab93ef2bbce2390a7d6f4124b832"}
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.433364 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" event={"ID":"5f3560a7-a7f9-4497-844f-360796f6ece8","Type":"ContainerDied","Data":"00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee"}
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.433524 4802 generic.go:334] "Generic (PLEG): container finished" podID="5f3560a7-a7f9-4497-844f-360796f6ece8" containerID="00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee" exitCode=0
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.449855 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gczlr" event={"ID":"4bbe4b6e-302e-4d6d-bc17-5f35baca1067","Type":"ContainerStarted","Data":"010493d97237b0b435426e718cba20278eac10c06b599624d4209168b8b86277"}
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.450145 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-gczlr"
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.466607 4802 generic.go:334] "Generic (PLEG): container finished" podID="15107e62-7679-460c-ab0e-f208b4a1ec76" containerID="79d87d739da4128d9d3df02e39172948caf59759e01dfcb300a952c5a149c417" exitCode=0
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.466679 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j9thx" event={"ID":"15107e62-7679-460c-ab0e-f208b4a1ec76","Type":"ContainerDied","Data":"79d87d739da4128d9d3df02e39172948caf59759e01dfcb300a952c5a149c417"}
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.474898 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9","Type":"ContainerStarted","Data":"6236123ea1633202f5fc5405502619f6d9f8ec1770ba0f46bb49b52e7c9ef7cc"}
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.490265 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"97e35ed2-d0e5-4f29-8869-3740e22f5cd9","Type":"ContainerStarted","Data":"957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f"}
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.501956 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6a6f4e1-7593-427b-b430-42c7b351e652","Type":"ContainerStarted","Data":"aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36"}
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.502213 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.555278 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-gczlr" podStartSLOduration=24.067865384 podStartE2EDuration="33.555251907s" podCreationTimestamp="2025-12-01 20:15:51 +0000 UTC" firstStartedPulling="2025-12-01 20:16:12.524679198 +0000 UTC m=+1194.087238839" lastFinishedPulling="2025-12-01 20:16:22.012065721 +0000 UTC m=+1203.574625362" observedRunningTime="2025-12-01 20:16:24.540962157 +0000 UTC m=+1206.103521808" watchObservedRunningTime="2025-12-01 20:16:24.555251907 +0000 UTC m=+1206.117811558"
Dec 01 20:16:24 crc kubenswrapper[4802]: I1201 20:16:24.582449 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=27.482269762 podStartE2EDuration="37.582424493s" podCreationTimestamp="2025-12-01 20:15:47 +0000 UTC" firstStartedPulling="2025-12-01 20:16:12.386642255 +0000 UTC m=+1193.949201896" lastFinishedPulling="2025-12-01 20:16:22.486796976 +0000 UTC m=+1204.049356627" observedRunningTime="2025-12-01 20:16:24.57058456 +0000 UTC m=+1206.133144211" watchObservedRunningTime="2025-12-01 20:16:24.582424493 +0000 UTC m=+1206.144984154"
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.599043 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" event={"ID":"415afd0d-1968-4f65-b7ed-6d4acbde81c9","Type":"ContainerStarted","Data":"4e9df40859e0444f99fb424b38a9c4c02ab36d8eff9728b2df85e6162ce4ed5d"}
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.600574 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.604701 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" event={"ID":"5f3560a7-a7f9-4497-844f-360796f6ece8","Type":"ContainerStarted","Data":"b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9"}
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.605362 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq"
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.606764 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"563ae8fc-e33c-402e-8901-79434cf68179","Type":"ContainerStarted","Data":"cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a"}
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.611920 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j9thx" event={"ID":"15107e62-7679-460c-ab0e-f208b4a1ec76","Type":"ContainerStarted","Data":"1606630be35b465d435018f15eb0373c8cfa30da0b18b1e493315cf39bc7e11b"}
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.611989 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j9thx" event={"ID":"15107e62-7679-460c-ab0e-f208b4a1ec76","Type":"ContainerStarted","Data":"308dd0117ca6b4e8741c16188cc5eda7b981917c97ed0084e3d38b85482134a4"}
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.612223 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j9thx"
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.612282 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j9thx"
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.632391 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" podStartSLOduration=10.897499964 podStartE2EDuration="11.632374263s" podCreationTimestamp="2025-12-01 20:16:14 +0000 UTC" firstStartedPulling="2025-12-01 20:16:22.859036046 +0000 UTC m=+1204.421595687" lastFinishedPulling="2025-12-01 20:16:23.593910345 +0000 UTC m=+1205.156469986" observedRunningTime="2025-12-01 20:16:25.627586871 +0000 UTC m=+1207.190146512" watchObservedRunningTime="2025-12-01 20:16:25.632374263 +0000 UTC m=+1207.194933904"
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.702451 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-j9thx" podStartSLOduration=25.502426114 podStartE2EDuration="34.702431847s" podCreationTimestamp="2025-12-01 20:15:51 +0000 UTC" firstStartedPulling="2025-12-01 20:16:12.809379424 +0000 UTC m=+1194.371939065" lastFinishedPulling="2025-12-01 20:16:22.009385157 +0000 UTC m=+1203.571944798" observedRunningTime="2025-12-01 20:16:25.697022277 +0000 UTC m=+1207.259581918" watchObservedRunningTime="2025-12-01 20:16:25.702431847 +0000 UTC m=+1207.264991488"
Dec 01 20:16:25 crc kubenswrapper[4802]: I1201 20:16:25.753935 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" podStartSLOduration=11.253942117 podStartE2EDuration="11.753918386s" podCreationTimestamp="2025-12-01 20:16:14 +0000 UTC" firstStartedPulling="2025-12-01 20:16:22.800144743 +0000 UTC m=+1204.362704374" lastFinishedPulling="2025-12-01 20:16:23.300121002 +0000 UTC m=+1204.862680643" observedRunningTime="2025-12-01 20:16:25.75245068 +0000 UTC m=+1207.315010331" watchObservedRunningTime="2025-12-01 20:16:25.753918386 +0000 UTC m=+1207.316478027"
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.679352 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9","Type":"ContainerStarted","Data":"5bf431890b1ed141f167a18baf6a63cc275935eb59ba9fc97fa4e87ebaca260d"}
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.682057 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c","Type":"ContainerStarted","Data":"c9319519a18fa194bf609735eed85c8042bd2796fd5e595c8a1fe3ec7165289a"}
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.685335 4802 generic.go:334] "Generic (PLEG): container finished" podID="00254b08-a75a-4965-8b19-f4bc8ebf6f52" containerID="fe97cc493715ea510d20838f2b468a1464f352dddb2ed049121628ea9985dca6" exitCode=0
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.685435 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"00254b08-a75a-4965-8b19-f4bc8ebf6f52","Type":"ContainerDied","Data":"fe97cc493715ea510d20838f2b468a1464f352dddb2ed049121628ea9985dca6"}
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.687498 4802 generic.go:334] "Generic (PLEG): container finished" podID="0f360e58-7047-4369-a8c8-4e0394586f62" containerID="7ffd4c0a748c136958a0655dd6dcd1aa6d2a9d6b5ba16ad3c949dd92cc8d0a2c" exitCode=0
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.687542 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f360e58-7047-4369-a8c8-4e0394586f62","Type":"ContainerDied","Data":"7ffd4c0a748c136958a0655dd6dcd1aa6d2a9d6b5ba16ad3c949dd92cc8d0a2c"}
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.688869 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6p8lx" event={"ID":"dd8140ed-9737-48ea-a0ea-15003dd90986","Type":"ContainerStarted","Data":"7dfd9dc4f653dfc1591da0cf64dfcb2b51196c4df8fd95d71b8009810f9b4872"}
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.708877 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.981490657 podStartE2EDuration="35.708856036s" podCreationTimestamp="2025-12-01 20:15:53 +0000 UTC" firstStartedPulling="2025-12-01 20:16:13.454653813 +0000 UTC m=+1195.017213454" lastFinishedPulling="2025-12-01 20:16:28.182019172 +0000 UTC m=+1209.744578833" observedRunningTime="2025-12-01 20:16:28.698427027 +0000 UTC m=+1210.260986688" watchObservedRunningTime="2025-12-01 20:16:28.708856036 +0000 UTC m=+1210.271415667"
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.730599 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=23.153935884 podStartE2EDuration="38.730574239s" podCreationTimestamp="2025-12-01 20:15:50 +0000 UTC" firstStartedPulling="2025-12-01 20:16:12.641243415 +0000 UTC m=+1194.203803056" lastFinishedPulling="2025-12-01 20:16:28.21788177 +0000 UTC m=+1209.780441411" observedRunningTime="2025-12-01 20:16:28.726007135 +0000 UTC m=+1210.288566776" watchObservedRunningTime="2025-12-01 20:16:28.730574239 +0000 UTC m=+1210.293133880"
Dec 01 20:16:28 crc kubenswrapper[4802]: I1201 20:16:28.761135 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6p8lx" podStartSLOduration=9.461173609 podStartE2EDuration="14.761116239s" podCreationTimestamp="2025-12-01 20:16:14 +0000 UTC" firstStartedPulling="2025-12-01 20:16:22.940086876 +0000 UTC m=+1204.502646517" lastFinishedPulling="2025-12-01 20:16:28.240029506 +0000 UTC m=+1209.802589147" observedRunningTime="2025-12-01 20:16:28.757708422 +0000 UTC m=+1210.320268063" watchObservedRunningTime="2025-12-01 20:16:28.761116239 +0000 UTC m=+1210.323675880"
Dec 01 20:16:29 crc kubenswrapper[4802]: I1201 20:16:29.698851 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0f360e58-7047-4369-a8c8-4e0394586f62","Type":"ContainerStarted","Data":"34b432806837268d98221d5ceb3208c5bf40932180706173e57a2765a86ee8ca"}
Dec 01 20:16:29 crc kubenswrapper[4802]: I1201 20:16:29.700490 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"00254b08-a75a-4965-8b19-f4bc8ebf6f52","Type":"ContainerStarted","Data":"105cf7d7015f5a52cbf6d8994abd11aa96765451515fb4ba910557f7027777eb"}
Dec 01 20:16:29 crc kubenswrapper[4802]: I1201 20:16:29.750646 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=37.821697555 podStartE2EDuration="47.750624859s" podCreationTimestamp="2025-12-01 20:15:42 +0000 UTC" firstStartedPulling="2025-12-01 20:16:12.559876035 +0000 UTC m=+1194.122435676" lastFinishedPulling="2025-12-01 20:16:22.488803339 +0000 UTC m=+1204.051362980" observedRunningTime="2025-12-01 20:16:29.719434967 +0000 UTC m=+1211.281994628" watchObservedRunningTime="2025-12-01 20:16:29.750624859 +0000 UTC m=+1211.313184510"
Dec 01 20:16:29 crc kubenswrapper[4802]: I1201 20:16:29.762268 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=36.70902434 podStartE2EDuration="46.762249634s" podCreationTimestamp="2025-12-01 20:15:43 +0000 UTC" firstStartedPulling="2025-12-01 20:16:12.380044207 +0000 UTC m=+1193.942603848" lastFinishedPulling="2025-12-01 20:16:22.433269501 +0000 UTC m=+1203.995829142" observedRunningTime="2025-12-01 20:16:29.74525032 +0000 UTC m=+1211.307809981" watchObservedRunningTime="2025-12-01 20:16:29.762249634 +0000 UTC m=+1211.324809285"
Dec 01 20:16:29 crc kubenswrapper[4802]: I1201 20:16:29.956501 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 01 20:16:30 crc kubenswrapper[4802]: I1201 20:16:30.099408 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq"
Dec 01 20:16:30 crc kubenswrapper[4802]: I1201 20:16:30.374367 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b"
Dec 01 20:16:30 crc kubenswrapper[4802]: I1201 20:16:30.448114 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6hncq"]
Dec 01 20:16:30 crc kubenswrapper[4802]: I1201 20:16:30.627319 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Dec 01 20:16:30 crc kubenswrapper[4802]: I1201 20:16:30.707357 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" podUID="5f3560a7-a7f9-4497-844f-360796f6ece8" containerName="dnsmasq-dns" containerID="cri-o://b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9" gracePeriod=10
Dec 01 20:16:30 crc kubenswrapper[4802]: I1201 20:16:30.957725 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.013186 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.045505 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.082582 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.169720 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.262452 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dskx9\" (UniqueName: \"kubernetes.io/projected/5f3560a7-a7f9-4497-844f-360796f6ece8-kube-api-access-dskx9\") pod \"5f3560a7-a7f9-4497-844f-360796f6ece8\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") "
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.262785 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-config\") pod \"5f3560a7-a7f9-4497-844f-360796f6ece8\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") "
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.262882 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-ovsdbserver-nb\") pod \"5f3560a7-a7f9-4497-844f-360796f6ece8\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") "
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.263049 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-dns-svc\") pod \"5f3560a7-a7f9-4497-844f-360796f6ece8\" (UID: \"5f3560a7-a7f9-4497-844f-360796f6ece8\") "
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.273581 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3560a7-a7f9-4497-844f-360796f6ece8-kube-api-access-dskx9" (OuterVolumeSpecName: "kube-api-access-dskx9") pod "5f3560a7-a7f9-4497-844f-360796f6ece8" (UID: "5f3560a7-a7f9-4497-844f-360796f6ece8"). InnerVolumeSpecName "kube-api-access-dskx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.307328 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5f3560a7-a7f9-4497-844f-360796f6ece8" (UID: "5f3560a7-a7f9-4497-844f-360796f6ece8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.308698 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-config" (OuterVolumeSpecName: "config") pod "5f3560a7-a7f9-4497-844f-360796f6ece8" (UID: "5f3560a7-a7f9-4497-844f-360796f6ece8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.310816 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f3560a7-a7f9-4497-844f-360796f6ece8" (UID: "5f3560a7-a7f9-4497-844f-360796f6ece8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.364948 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.365005 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dskx9\" (UniqueName: \"kubernetes.io/projected/5f3560a7-a7f9-4497-844f-360796f6ece8-kube-api-access-dskx9\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.365021 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-config\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.365033 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f3560a7-a7f9-4497-844f-360796f6ece8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.717251 4802 generic.go:334] "Generic (PLEG): container finished" podID="5f3560a7-a7f9-4497-844f-360796f6ece8" containerID="b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9" exitCode=0
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.717437 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" event={"ID":"5f3560a7-a7f9-4497-844f-360796f6ece8","Type":"ContainerDied","Data":"b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9"}
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.717569 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.718188 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6hncq" event={"ID":"5f3560a7-a7f9-4497-844f-360796f6ece8","Type":"ContainerDied","Data":"a8ca37e3a4e3eea09819804a8ab0bf1934ae4f63c7475e5e5da1284115cc64c9"}
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.718393 4802 scope.go:117] "RemoveContainer" containerID="b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.719645 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.751263 4802 scope.go:117] "RemoveContainer" containerID="00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.762176 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6hncq"]
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.772797 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.775080 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6hncq"]
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.778371 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.783642 4802 scope.go:117] "RemoveContainer" containerID="b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9"
Dec 01 20:16:31 crc kubenswrapper[4802]: E1201 20:16:31.784020 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9\": container with ID starting with b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9 not found: ID does not exist" containerID="b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9" Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.784055 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9"} err="failed to get container status \"b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9\": rpc error: code = NotFound desc = could not find container \"b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9\": container with ID starting with b9bf9a05bbf9223f5d57d3f0a8f43af24c72217cdb2e020edf665fc52573e0f9 not found: ID does not exist" Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.784081 4802 scope.go:117] "RemoveContainer" containerID="00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee" Dec 01 20:16:31 crc kubenswrapper[4802]: E1201 20:16:31.784399 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee\": container with ID starting with 00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee not found: ID does not exist" containerID="00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee" Dec 01 20:16:31 crc kubenswrapper[4802]: I1201 20:16:31.784455 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee"} err="failed to get container status \"00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee\": rpc error: code = NotFound desc = could not find container \"00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee\": container with ID 
starting with 00b584710b64dc43a7acec553e2a2c02920a149b1b6b1dcabed672e56321e9ee not found: ID does not exist" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.031103 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 01 20:16:32 crc kubenswrapper[4802]: E1201 20:16:32.031500 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3560a7-a7f9-4497-844f-360796f6ece8" containerName="init" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.031516 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3560a7-a7f9-4497-844f-360796f6ece8" containerName="init" Dec 01 20:16:32 crc kubenswrapper[4802]: E1201 20:16:32.031528 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3560a7-a7f9-4497-844f-360796f6ece8" containerName="dnsmasq-dns" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.031536 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3560a7-a7f9-4497-844f-360796f6ece8" containerName="dnsmasq-dns" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.031693 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f3560a7-a7f9-4497-844f-360796f6ece8" containerName="dnsmasq-dns" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.032532 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.044324 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.044511 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.047547 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.047628 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-bwd2x" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.057450 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.077070 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.077130 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-scripts\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.077171 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-config\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc 
kubenswrapper[4802]: I1201 20:16:32.077234 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.077324 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.077349 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.077379 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmdzk\" (UniqueName: \"kubernetes.io/projected/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-kube-api-access-hmdzk\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.178477 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.178528 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.178560 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmdzk\" (UniqueName: \"kubernetes.io/projected/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-kube-api-access-hmdzk\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.178583 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.178611 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-scripts\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.178639 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-config\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.178672 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.179254 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.179915 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-scripts\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.180520 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-config\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.183902 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.187931 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.191744 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.216180 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmdzk\" (UniqueName: \"kubernetes.io/projected/d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a-kube-api-access-hmdzk\") pod \"ovn-northd-0\" (UID: \"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a\") " pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.356334 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 20:16:32 crc kubenswrapper[4802]: E1201 20:16:32.500608 4802 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:36690->38.102.83.151:34181: write tcp 38.102.83.151:36690->38.102.83.151:34181: write: broken pipe Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.729166 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f3560a7-a7f9-4497-844f-360796f6ece8" path="/var/lib/kubelet/pods/5f3560a7-a7f9-4497-844f-360796f6ece8/volumes" Dec 01 20:16:32 crc kubenswrapper[4802]: I1201 20:16:32.842589 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 20:16:33 crc kubenswrapper[4802]: I1201 20:16:33.733765 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a","Type":"ContainerStarted","Data":"1b4fd7ac1f082339f81194967cc3937d0d7b12fdea9484ce938206569158bfb7"} Dec 01 20:16:34 crc kubenswrapper[4802]: I1201 20:16:34.221709 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 01 20:16:34 crc kubenswrapper[4802]: I1201 20:16:34.221994 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/openstack-galera-0" Dec 01 20:16:34 crc kubenswrapper[4802]: I1201 20:16:34.315554 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 01 20:16:34 crc kubenswrapper[4802]: I1201 20:16:34.850377 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.208250 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wbzkj"] Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.210761 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.268043 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wbzkj"] Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.330267 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aebb6147-155e-4021-88e7-19d2f1c2ffff-operator-scripts\") pod \"keystone-db-create-wbzkj\" (UID: \"aebb6147-155e-4021-88e7-19d2f1c2ffff\") " pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.330397 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rnwf\" (UniqueName: \"kubernetes.io/projected/aebb6147-155e-4021-88e7-19d2f1c2ffff-kube-api-access-4rnwf\") pod \"keystone-db-create-wbzkj\" (UID: \"aebb6147-155e-4021-88e7-19d2f1c2ffff\") " pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.355460 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5e7f-account-create-update-dt8m9"] Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.357849 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.379425 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.379948 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.379986 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.435489 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rnwf\" (UniqueName: \"kubernetes.io/projected/aebb6147-155e-4021-88e7-19d2f1c2ffff-kube-api-access-4rnwf\") pod \"keystone-db-create-wbzkj\" (UID: \"aebb6147-155e-4021-88e7-19d2f1c2ffff\") " pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.435671 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aebb6147-155e-4021-88e7-19d2f1c2ffff-operator-scripts\") pod \"keystone-db-create-wbzkj\" (UID: \"aebb6147-155e-4021-88e7-19d2f1c2ffff\") " pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.435887 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5e7f-account-create-update-dt8m9"] Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.441609 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aebb6147-155e-4021-88e7-19d2f1c2ffff-operator-scripts\") pod \"keystone-db-create-wbzkj\" (UID: \"aebb6147-155e-4021-88e7-19d2f1c2ffff\") " pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.456072 
4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rxltz"] Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.457462 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rxltz" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.464048 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rnwf\" (UniqueName: \"kubernetes.io/projected/aebb6147-155e-4021-88e7-19d2f1c2ffff-kube-api-access-4rnwf\") pod \"keystone-db-create-wbzkj\" (UID: \"aebb6147-155e-4021-88e7-19d2f1c2ffff\") " pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.469051 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rxltz"] Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.524927 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.539268 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/703d4a88-3292-40bc-871c-6e87449826d0-operator-scripts\") pod \"keystone-5e7f-account-create-update-dt8m9\" (UID: \"703d4a88-3292-40bc-871c-6e87449826d0\") " pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.539414 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsnfg\" (UniqueName: \"kubernetes.io/projected/703d4a88-3292-40bc-871c-6e87449826d0-kube-api-access-qsnfg\") pod \"keystone-5e7f-account-create-update-dt8m9\" (UID: \"703d4a88-3292-40bc-871c-6e87449826d0\") " pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.556311 4802 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/placement-db78-account-create-update-t5b96"] Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.565713 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db78-account-create-update-t5b96"] Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.565901 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.568116 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.690432 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjjgn\" (UniqueName: \"kubernetes.io/projected/5842d665-da28-4918-abcc-106446b09206-kube-api-access-pjjgn\") pod \"placement-db78-account-create-update-t5b96\" (UID: \"5842d665-da28-4918-abcc-106446b09206\") " pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.690537 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsnfg\" (UniqueName: \"kubernetes.io/projected/703d4a88-3292-40bc-871c-6e87449826d0-kube-api-access-qsnfg\") pod \"keystone-5e7f-account-create-update-dt8m9\" (UID: \"703d4a88-3292-40bc-871c-6e87449826d0\") " pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.690578 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-operator-scripts\") pod \"placement-db-create-rxltz\" (UID: \"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\") " pod="openstack/placement-db-create-rxltz" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.690597 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2nth\" (UniqueName: \"kubernetes.io/projected/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-kube-api-access-q2nth\") pod \"placement-db-create-rxltz\" (UID: \"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\") " pod="openstack/placement-db-create-rxltz" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.690619 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5842d665-da28-4918-abcc-106446b09206-operator-scripts\") pod \"placement-db78-account-create-update-t5b96\" (UID: \"5842d665-da28-4918-abcc-106446b09206\") " pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.690666 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/703d4a88-3292-40bc-871c-6e87449826d0-operator-scripts\") pod \"keystone-5e7f-account-create-update-dt8m9\" (UID: \"703d4a88-3292-40bc-871c-6e87449826d0\") " pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.691365 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/703d4a88-3292-40bc-871c-6e87449826d0-operator-scripts\") pod \"keystone-5e7f-account-create-update-dt8m9\" (UID: \"703d4a88-3292-40bc-871c-6e87449826d0\") " pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.709456 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsnfg\" (UniqueName: \"kubernetes.io/projected/703d4a88-3292-40bc-871c-6e87449826d0-kube-api-access-qsnfg\") pod \"keystone-5e7f-account-create-update-dt8m9\" (UID: \"703d4a88-3292-40bc-871c-6e87449826d0\") " 
pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.752274 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.758130 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a","Type":"ContainerStarted","Data":"48de826b86b7886230ace282826d02c53bd45f8687fdd3d5c157060857689bcb"} Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.758168 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a","Type":"ContainerStarted","Data":"55306c78c6029496999e13da115955b4e0b0144d23dd16166dad33463b12739b"} Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.758219 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.785614 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.087507792 podStartE2EDuration="4.785599393s" podCreationTimestamp="2025-12-01 20:16:31 +0000 UTC" firstStartedPulling="2025-12-01 20:16:32.853654476 +0000 UTC m=+1214.416214167" lastFinishedPulling="2025-12-01 20:16:34.551746127 +0000 UTC m=+1216.114305768" observedRunningTime="2025-12-01 20:16:35.785123138 +0000 UTC m=+1217.347682779" watchObservedRunningTime="2025-12-01 20:16:35.785599393 +0000 UTC m=+1217.348159034" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.791648 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5842d665-da28-4918-abcc-106446b09206-operator-scripts\") pod \"placement-db78-account-create-update-t5b96\" (UID: \"5842d665-da28-4918-abcc-106446b09206\") " 
pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.791751 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjjgn\" (UniqueName: \"kubernetes.io/projected/5842d665-da28-4918-abcc-106446b09206-kube-api-access-pjjgn\") pod \"placement-db78-account-create-update-t5b96\" (UID: \"5842d665-da28-4918-abcc-106446b09206\") " pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.791841 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-operator-scripts\") pod \"placement-db-create-rxltz\" (UID: \"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\") " pod="openstack/placement-db-create-rxltz" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.791858 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2nth\" (UniqueName: \"kubernetes.io/projected/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-kube-api-access-q2nth\") pod \"placement-db-create-rxltz\" (UID: \"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\") " pod="openstack/placement-db-create-rxltz" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.792938 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5842d665-da28-4918-abcc-106446b09206-operator-scripts\") pod \"placement-db78-account-create-update-t5b96\" (UID: \"5842d665-da28-4918-abcc-106446b09206\") " pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.792981 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-operator-scripts\") pod \"placement-db-create-rxltz\" (UID: 
\"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\") " pod="openstack/placement-db-create-rxltz" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.815401 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjjgn\" (UniqueName: \"kubernetes.io/projected/5842d665-da28-4918-abcc-106446b09206-kube-api-access-pjjgn\") pod \"placement-db78-account-create-update-t5b96\" (UID: \"5842d665-da28-4918-abcc-106446b09206\") " pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.815636 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2nth\" (UniqueName: \"kubernetes.io/projected/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-kube-api-access-q2nth\") pod \"placement-db-create-rxltz\" (UID: \"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\") " pod="openstack/placement-db-create-rxltz" Dec 01 20:16:35 crc kubenswrapper[4802]: I1201 20:16:35.914229 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.085072 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rxltz" Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.132909 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wbzkj"] Dec 01 20:16:36 crc kubenswrapper[4802]: W1201 20:16:36.140013 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaebb6147_155e_4021_88e7_19d2f1c2ffff.slice/crio-def296e0ad62ff92343a7696bfd55877913be24b6f03ed4720f5032eb6d6b101 WatchSource:0}: Error finding container def296e0ad62ff92343a7696bfd55877913be24b6f03ed4720f5032eb6d6b101: Status 404 returned error can't find the container with id def296e0ad62ff92343a7696bfd55877913be24b6f03ed4720f5032eb6d6b101 Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.258009 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5e7f-account-create-update-dt8m9"] Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.432941 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db78-account-create-update-t5b96"] Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.645919 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rxltz"] Dec 01 20:16:36 crc kubenswrapper[4802]: E1201 20:16:36.694091 4802 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaebb6147_155e_4021_88e7_19d2f1c2ffff.slice/crio-conmon-e8cbb1c3d378658de37cdab71495281c580ad159bbdc2ad6b4b0be535009c990.scope\": RecentStats: unable to find data in memory cache]" Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.766916 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db78-account-create-update-t5b96" 
event={"ID":"5842d665-da28-4918-abcc-106446b09206","Type":"ContainerStarted","Data":"388374400d4ec3c1e2b217b0cc1f3926f2c185f4da72a3811c04ae3808b0ecae"} Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.769049 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e7f-account-create-update-dt8m9" event={"ID":"703d4a88-3292-40bc-871c-6e87449826d0","Type":"ContainerStarted","Data":"6072232d7c220a592ce53e9e8abe53cc3a9c007c8c9d7210e98e1559e95f746b"} Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.769087 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e7f-account-create-update-dt8m9" event={"ID":"703d4a88-3292-40bc-871c-6e87449826d0","Type":"ContainerStarted","Data":"c056e2fc1b815f13f93f6375219dfb2c6e7250f33109abd80781a9dc423d45d5"} Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.770856 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rxltz" event={"ID":"5918a7fb-54b8-45ff-9f5d-0c86f93553fe","Type":"ContainerStarted","Data":"10747bcec39604c0e9537b7fd6a3fec2301c0758f8af6c135475ced626dd1d80"} Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.780884 4802 generic.go:334] "Generic (PLEG): container finished" podID="aebb6147-155e-4021-88e7-19d2f1c2ffff" containerID="e8cbb1c3d378658de37cdab71495281c580ad159bbdc2ad6b4b0be535009c990" exitCode=0 Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.781036 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wbzkj" event={"ID":"aebb6147-155e-4021-88e7-19d2f1c2ffff","Type":"ContainerDied","Data":"e8cbb1c3d378658de37cdab71495281c580ad159bbdc2ad6b4b0be535009c990"} Dec 01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.781088 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wbzkj" event={"ID":"aebb6147-155e-4021-88e7-19d2f1c2ffff","Type":"ContainerStarted","Data":"def296e0ad62ff92343a7696bfd55877913be24b6f03ed4720f5032eb6d6b101"} Dec 
01 20:16:36 crc kubenswrapper[4802]: I1201 20:16:36.928132 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5e7f-account-create-update-dt8m9" podStartSLOduration=1.928116305 podStartE2EDuration="1.928116305s" podCreationTimestamp="2025-12-01 20:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:16:36.906251218 +0000 UTC m=+1218.468810879" watchObservedRunningTime="2025-12-01 20:16:36.928116305 +0000 UTC m=+1218.490675956" Dec 01 20:16:37 crc kubenswrapper[4802]: I1201 20:16:37.498433 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 20:16:37 crc kubenswrapper[4802]: I1201 20:16:37.776709 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 01 20:16:37 crc kubenswrapper[4802]: I1201 20:16:37.789851 4802 generic.go:334] "Generic (PLEG): container finished" podID="5842d665-da28-4918-abcc-106446b09206" containerID="72a32f2baec80aaaadb56bd8ad611201a57f152a27790c639c20bf86d685e53e" exitCode=0 Dec 01 20:16:37 crc kubenswrapper[4802]: I1201 20:16:37.789935 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db78-account-create-update-t5b96" event={"ID":"5842d665-da28-4918-abcc-106446b09206","Type":"ContainerDied","Data":"72a32f2baec80aaaadb56bd8ad611201a57f152a27790c639c20bf86d685e53e"} Dec 01 20:16:37 crc kubenswrapper[4802]: I1201 20:16:37.791623 4802 generic.go:334] "Generic (PLEG): container finished" podID="703d4a88-3292-40bc-871c-6e87449826d0" containerID="6072232d7c220a592ce53e9e8abe53cc3a9c007c8c9d7210e98e1559e95f746b" exitCode=0 Dec 01 20:16:37 crc kubenswrapper[4802]: I1201 20:16:37.791692 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e7f-account-create-update-dt8m9" 
event={"ID":"703d4a88-3292-40bc-871c-6e87449826d0","Type":"ContainerDied","Data":"6072232d7c220a592ce53e9e8abe53cc3a9c007c8c9d7210e98e1559e95f746b"} Dec 01 20:16:37 crc kubenswrapper[4802]: I1201 20:16:37.793520 4802 generic.go:334] "Generic (PLEG): container finished" podID="5918a7fb-54b8-45ff-9f5d-0c86f93553fe" containerID="d6353307613dfc077353cf064cdc66614f634042a3d0013764a4524c8f6257f6" exitCode=0 Dec 01 20:16:37 crc kubenswrapper[4802]: I1201 20:16:37.793585 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rxltz" event={"ID":"5918a7fb-54b8-45ff-9f5d-0c86f93553fe","Type":"ContainerDied","Data":"d6353307613dfc077353cf064cdc66614f634042a3d0013764a4524c8f6257f6"} Dec 01 20:16:37 crc kubenswrapper[4802]: I1201 20:16:37.885226 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.220745 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.346680 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aebb6147-155e-4021-88e7-19d2f1c2ffff-operator-scripts\") pod \"aebb6147-155e-4021-88e7-19d2f1c2ffff\" (UID: \"aebb6147-155e-4021-88e7-19d2f1c2ffff\") " Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.346848 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rnwf\" (UniqueName: \"kubernetes.io/projected/aebb6147-155e-4021-88e7-19d2f1c2ffff-kube-api-access-4rnwf\") pod \"aebb6147-155e-4021-88e7-19d2f1c2ffff\" (UID: \"aebb6147-155e-4021-88e7-19d2f1c2ffff\") " Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.347496 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aebb6147-155e-4021-88e7-19d2f1c2ffff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aebb6147-155e-4021-88e7-19d2f1c2ffff" (UID: "aebb6147-155e-4021-88e7-19d2f1c2ffff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.356322 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aebb6147-155e-4021-88e7-19d2f1c2ffff-kube-api-access-4rnwf" (OuterVolumeSpecName: "kube-api-access-4rnwf") pod "aebb6147-155e-4021-88e7-19d2f1c2ffff" (UID: "aebb6147-155e-4021-88e7-19d2f1c2ffff"). InnerVolumeSpecName "kube-api-access-4rnwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.449038 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aebb6147-155e-4021-88e7-19d2f1c2ffff-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.449079 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rnwf\" (UniqueName: \"kubernetes.io/projected/aebb6147-155e-4021-88e7-19d2f1c2ffff-kube-api-access-4rnwf\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.803422 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wbzkj" event={"ID":"aebb6147-155e-4021-88e7-19d2f1c2ffff","Type":"ContainerDied","Data":"def296e0ad62ff92343a7696bfd55877913be24b6f03ed4720f5032eb6d6b101"} Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.803500 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="def296e0ad62ff92343a7696bfd55877913be24b6f03ed4720f5032eb6d6b101" Dec 01 20:16:38 crc kubenswrapper[4802]: I1201 20:16:38.803608 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wbzkj" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.228080 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rxltz" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.362362 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-operator-scripts\") pod \"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\" (UID: \"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\") " Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.362505 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2nth\" (UniqueName: \"kubernetes.io/projected/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-kube-api-access-q2nth\") pod \"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\" (UID: \"5918a7fb-54b8-45ff-9f5d-0c86f93553fe\") " Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.363795 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5918a7fb-54b8-45ff-9f5d-0c86f93553fe" (UID: "5918a7fb-54b8-45ff-9f5d-0c86f93553fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.369464 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-kube-api-access-q2nth" (OuterVolumeSpecName: "kube-api-access-q2nth") pod "5918a7fb-54b8-45ff-9f5d-0c86f93553fe" (UID: "5918a7fb-54b8-45ff-9f5d-0c86f93553fe"). InnerVolumeSpecName "kube-api-access-q2nth". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.374526 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.378188 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.464047 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/703d4a88-3292-40bc-871c-6e87449826d0-operator-scripts\") pod \"703d4a88-3292-40bc-871c-6e87449826d0\" (UID: \"703d4a88-3292-40bc-871c-6e87449826d0\") " Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.464513 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsnfg\" (UniqueName: \"kubernetes.io/projected/703d4a88-3292-40bc-871c-6e87449826d0-kube-api-access-qsnfg\") pod \"703d4a88-3292-40bc-871c-6e87449826d0\" (UID: \"703d4a88-3292-40bc-871c-6e87449826d0\") " Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.464541 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjjgn\" (UniqueName: \"kubernetes.io/projected/5842d665-da28-4918-abcc-106446b09206-kube-api-access-pjjgn\") pod \"5842d665-da28-4918-abcc-106446b09206\" (UID: \"5842d665-da28-4918-abcc-106446b09206\") " Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.464581 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5842d665-da28-4918-abcc-106446b09206-operator-scripts\") pod \"5842d665-da28-4918-abcc-106446b09206\" (UID: \"5842d665-da28-4918-abcc-106446b09206\") " Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.464690 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703d4a88-3292-40bc-871c-6e87449826d0-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "703d4a88-3292-40bc-871c-6e87449826d0" (UID: "703d4a88-3292-40bc-871c-6e87449826d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.464906 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.464923 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/703d4a88-3292-40bc-871c-6e87449826d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.464932 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2nth\" (UniqueName: \"kubernetes.io/projected/5918a7fb-54b8-45ff-9f5d-0c86f93553fe-kube-api-access-q2nth\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.465461 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5842d665-da28-4918-abcc-106446b09206-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5842d665-da28-4918-abcc-106446b09206" (UID: "5842d665-da28-4918-abcc-106446b09206"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.468127 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5842d665-da28-4918-abcc-106446b09206-kube-api-access-pjjgn" (OuterVolumeSpecName: "kube-api-access-pjjgn") pod "5842d665-da28-4918-abcc-106446b09206" (UID: "5842d665-da28-4918-abcc-106446b09206"). InnerVolumeSpecName "kube-api-access-pjjgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.468227 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703d4a88-3292-40bc-871c-6e87449826d0-kube-api-access-qsnfg" (OuterVolumeSpecName: "kube-api-access-qsnfg") pod "703d4a88-3292-40bc-871c-6e87449826d0" (UID: "703d4a88-3292-40bc-871c-6e87449826d0"). InnerVolumeSpecName "kube-api-access-qsnfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.566825 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsnfg\" (UniqueName: \"kubernetes.io/projected/703d4a88-3292-40bc-871c-6e87449826d0-kube-api-access-qsnfg\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.566874 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjjgn\" (UniqueName: \"kubernetes.io/projected/5842d665-da28-4918-abcc-106446b09206-kube-api-access-pjjgn\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.566895 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5842d665-da28-4918-abcc-106446b09206-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.814474 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db78-account-create-update-t5b96" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.814501 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db78-account-create-update-t5b96" event={"ID":"5842d665-da28-4918-abcc-106446b09206","Type":"ContainerDied","Data":"388374400d4ec3c1e2b217b0cc1f3926f2c185f4da72a3811c04ae3808b0ecae"} Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.814567 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388374400d4ec3c1e2b217b0cc1f3926f2c185f4da72a3811c04ae3808b0ecae" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.816947 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5e7f-account-create-update-dt8m9" event={"ID":"703d4a88-3292-40bc-871c-6e87449826d0","Type":"ContainerDied","Data":"c056e2fc1b815f13f93f6375219dfb2c6e7250f33109abd80781a9dc423d45d5"} Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.816986 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c056e2fc1b815f13f93f6375219dfb2c6e7250f33109abd80781a9dc423d45d5" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.817074 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5e7f-account-create-update-dt8m9" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.821074 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rxltz" event={"ID":"5918a7fb-54b8-45ff-9f5d-0c86f93553fe","Type":"ContainerDied","Data":"10747bcec39604c0e9537b7fd6a3fec2301c0758f8af6c135475ced626dd1d80"} Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.821281 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10747bcec39604c0e9537b7fd6a3fec2301c0758f8af6c135475ced626dd1d80" Dec 01 20:16:39 crc kubenswrapper[4802]: I1201 20:16:39.821181 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rxltz" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.827877 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7mqtg"] Dec 01 20:16:40 crc kubenswrapper[4802]: E1201 20:16:40.828241 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5842d665-da28-4918-abcc-106446b09206" containerName="mariadb-account-create-update" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.828256 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5842d665-da28-4918-abcc-106446b09206" containerName="mariadb-account-create-update" Dec 01 20:16:40 crc kubenswrapper[4802]: E1201 20:16:40.828271 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5918a7fb-54b8-45ff-9f5d-0c86f93553fe" containerName="mariadb-database-create" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.828277 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5918a7fb-54b8-45ff-9f5d-0c86f93553fe" containerName="mariadb-database-create" Dec 01 20:16:40 crc kubenswrapper[4802]: E1201 20:16:40.828286 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703d4a88-3292-40bc-871c-6e87449826d0" 
containerName="mariadb-account-create-update" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.828293 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="703d4a88-3292-40bc-871c-6e87449826d0" containerName="mariadb-account-create-update" Dec 01 20:16:40 crc kubenswrapper[4802]: E1201 20:16:40.828306 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebb6147-155e-4021-88e7-19d2f1c2ffff" containerName="mariadb-database-create" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.828312 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebb6147-155e-4021-88e7-19d2f1c2ffff" containerName="mariadb-database-create" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.828452 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="aebb6147-155e-4021-88e7-19d2f1c2ffff" containerName="mariadb-database-create" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.828462 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="703d4a88-3292-40bc-871c-6e87449826d0" containerName="mariadb-account-create-update" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.828474 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="5842d665-da28-4918-abcc-106446b09206" containerName="mariadb-account-create-update" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.828490 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="5918a7fb-54b8-45ff-9f5d-0c86f93553fe" containerName="mariadb-database-create" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.829009 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.839767 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7mqtg"] Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.925747 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e376-account-create-update-rz2t8"] Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.926836 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.929957 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.937114 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e376-account-create-update-rz2t8"] Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.995695 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c415ac94-c3a7-4e60-953f-748556482cc6-operator-scripts\") pod \"glance-db-create-7mqtg\" (UID: \"c415ac94-c3a7-4e60-953f-748556482cc6\") " pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:40 crc kubenswrapper[4802]: I1201 20:16:40.995840 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jcv\" (UniqueName: \"kubernetes.io/projected/c415ac94-c3a7-4e60-953f-748556482cc6-kube-api-access-j7jcv\") pod \"glance-db-create-7mqtg\" (UID: \"c415ac94-c3a7-4e60-953f-748556482cc6\") " pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.097339 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c415ac94-c3a7-4e60-953f-748556482cc6-operator-scripts\") pod 
\"glance-db-create-7mqtg\" (UID: \"c415ac94-c3a7-4e60-953f-748556482cc6\") " pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.097431 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c488df2-71d2-4e82-9ac1-224ad92f0744-operator-scripts\") pod \"glance-e376-account-create-update-rz2t8\" (UID: \"1c488df2-71d2-4e82-9ac1-224ad92f0744\") " pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.097463 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jcv\" (UniqueName: \"kubernetes.io/projected/c415ac94-c3a7-4e60-953f-748556482cc6-kube-api-access-j7jcv\") pod \"glance-db-create-7mqtg\" (UID: \"c415ac94-c3a7-4e60-953f-748556482cc6\") " pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.097659 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd4kb\" (UniqueName: \"kubernetes.io/projected/1c488df2-71d2-4e82-9ac1-224ad92f0744-kube-api-access-xd4kb\") pod \"glance-e376-account-create-update-rz2t8\" (UID: \"1c488df2-71d2-4e82-9ac1-224ad92f0744\") " pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.098529 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c415ac94-c3a7-4e60-953f-748556482cc6-operator-scripts\") pod \"glance-db-create-7mqtg\" (UID: \"c415ac94-c3a7-4e60-953f-748556482cc6\") " pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.115403 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jcv\" (UniqueName: 
\"kubernetes.io/projected/c415ac94-c3a7-4e60-953f-748556482cc6-kube-api-access-j7jcv\") pod \"glance-db-create-7mqtg\" (UID: \"c415ac94-c3a7-4e60-953f-748556482cc6\") " pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.145398 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.199478 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c488df2-71d2-4e82-9ac1-224ad92f0744-operator-scripts\") pod \"glance-e376-account-create-update-rz2t8\" (UID: \"1c488df2-71d2-4e82-9ac1-224ad92f0744\") " pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.199566 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd4kb\" (UniqueName: \"kubernetes.io/projected/1c488df2-71d2-4e82-9ac1-224ad92f0744-kube-api-access-xd4kb\") pod \"glance-e376-account-create-update-rz2t8\" (UID: \"1c488df2-71d2-4e82-9ac1-224ad92f0744\") " pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.200391 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c488df2-71d2-4e82-9ac1-224ad92f0744-operator-scripts\") pod \"glance-e376-account-create-update-rz2t8\" (UID: \"1c488df2-71d2-4e82-9ac1-224ad92f0744\") " pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.364543 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd4kb\" (UniqueName: \"kubernetes.io/projected/1c488df2-71d2-4e82-9ac1-224ad92f0744-kube-api-access-xd4kb\") pod \"glance-e376-account-create-update-rz2t8\" (UID: \"1c488df2-71d2-4e82-9ac1-224ad92f0744\") " 
pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.543086 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.769014 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7mqtg"] Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.862738 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7mqtg" event={"ID":"c415ac94-c3a7-4e60-953f-748556482cc6","Type":"ContainerStarted","Data":"f1a903df0c42cb44fbdb256500280e3a4dfc7b4199bb64a57c575851a5af9df3"} Dec 01 20:16:41 crc kubenswrapper[4802]: I1201 20:16:41.891142 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e376-account-create-update-rz2t8"] Dec 01 20:16:42 crc kubenswrapper[4802]: I1201 20:16:42.873329 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e376-account-create-update-rz2t8" event={"ID":"1c488df2-71d2-4e82-9ac1-224ad92f0744","Type":"ContainerStarted","Data":"2dc31caa5a4c298678324bdec7cebd25f943ccff3b8a166748a8797eb78a4c6b"} Dec 01 20:16:43 crc kubenswrapper[4802]: I1201 20:16:43.884370 4802 generic.go:334] "Generic (PLEG): container finished" podID="1c488df2-71d2-4e82-9ac1-224ad92f0744" containerID="a46b6644d49134838ae8f1b2058232568f9524ff723e1943b9f2222077639a79" exitCode=0 Dec 01 20:16:43 crc kubenswrapper[4802]: I1201 20:16:43.884440 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e376-account-create-update-rz2t8" event={"ID":"1c488df2-71d2-4e82-9ac1-224ad92f0744","Type":"ContainerDied","Data":"a46b6644d49134838ae8f1b2058232568f9524ff723e1943b9f2222077639a79"} Dec 01 20:16:43 crc kubenswrapper[4802]: I1201 20:16:43.886334 4802 generic.go:334] "Generic (PLEG): container finished" podID="c415ac94-c3a7-4e60-953f-748556482cc6" 
containerID="55cf11a93b945ea13e7ce60da4d4057335a0570510ca7db223ef02378401fd9d" exitCode=0 Dec 01 20:16:43 crc kubenswrapper[4802]: I1201 20:16:43.886376 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7mqtg" event={"ID":"c415ac94-c3a7-4e60-953f-748556482cc6","Type":"ContainerDied","Data":"55cf11a93b945ea13e7ce60da4d4057335a0570510ca7db223ef02378401fd9d"} Dec 01 20:16:45 crc kubenswrapper[4802]: E1201 20:16:45.115889 4802 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.339747 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.346629 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.401437 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c488df2-71d2-4e82-9ac1-224ad92f0744-operator-scripts\") pod \"1c488df2-71d2-4e82-9ac1-224ad92f0744\" (UID: \"1c488df2-71d2-4e82-9ac1-224ad92f0744\") " Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.401493 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd4kb\" (UniqueName: \"kubernetes.io/projected/1c488df2-71d2-4e82-9ac1-224ad92f0744-kube-api-access-xd4kb\") pod \"1c488df2-71d2-4e82-9ac1-224ad92f0744\" (UID: \"1c488df2-71d2-4e82-9ac1-224ad92f0744\") " Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.401523 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c415ac94-c3a7-4e60-953f-748556482cc6-operator-scripts\") pod 
\"c415ac94-c3a7-4e60-953f-748556482cc6\" (UID: \"c415ac94-c3a7-4e60-953f-748556482cc6\") " Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.401589 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7jcv\" (UniqueName: \"kubernetes.io/projected/c415ac94-c3a7-4e60-953f-748556482cc6-kube-api-access-j7jcv\") pod \"c415ac94-c3a7-4e60-953f-748556482cc6\" (UID: \"c415ac94-c3a7-4e60-953f-748556482cc6\") " Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.405408 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c488df2-71d2-4e82-9ac1-224ad92f0744-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c488df2-71d2-4e82-9ac1-224ad92f0744" (UID: "1c488df2-71d2-4e82-9ac1-224ad92f0744"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.405571 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c415ac94-c3a7-4e60-953f-748556482cc6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c415ac94-c3a7-4e60-953f-748556482cc6" (UID: "c415ac94-c3a7-4e60-953f-748556482cc6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.410405 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c488df2-71d2-4e82-9ac1-224ad92f0744-kube-api-access-xd4kb" (OuterVolumeSpecName: "kube-api-access-xd4kb") pod "1c488df2-71d2-4e82-9ac1-224ad92f0744" (UID: "1c488df2-71d2-4e82-9ac1-224ad92f0744"). InnerVolumeSpecName "kube-api-access-xd4kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.410491 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c415ac94-c3a7-4e60-953f-748556482cc6-kube-api-access-j7jcv" (OuterVolumeSpecName: "kube-api-access-j7jcv") pod "c415ac94-c3a7-4e60-953f-748556482cc6" (UID: "c415ac94-c3a7-4e60-953f-748556482cc6"). InnerVolumeSpecName "kube-api-access-j7jcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.502923 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7jcv\" (UniqueName: \"kubernetes.io/projected/c415ac94-c3a7-4e60-953f-748556482cc6-kube-api-access-j7jcv\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.503267 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c488df2-71d2-4e82-9ac1-224ad92f0744-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.503277 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd4kb\" (UniqueName: \"kubernetes.io/projected/1c488df2-71d2-4e82-9ac1-224ad92f0744-kube-api-access-xd4kb\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.503289 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c415ac94-c3a7-4e60-953f-748556482cc6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.905874 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e376-account-create-update-rz2t8" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.905885 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e376-account-create-update-rz2t8" event={"ID":"1c488df2-71d2-4e82-9ac1-224ad92f0744","Type":"ContainerDied","Data":"2dc31caa5a4c298678324bdec7cebd25f943ccff3b8a166748a8797eb78a4c6b"} Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.905924 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dc31caa5a4c298678324bdec7cebd25f943ccff3b8a166748a8797eb78a4c6b" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.907534 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7mqtg" event={"ID":"c415ac94-c3a7-4e60-953f-748556482cc6","Type":"ContainerDied","Data":"f1a903df0c42cb44fbdb256500280e3a4dfc7b4199bb64a57c575851a5af9df3"} Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.907565 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1a903df0c42cb44fbdb256500280e3a4dfc7b4199bb64a57c575851a5af9df3" Dec 01 20:16:45 crc kubenswrapper[4802]: I1201 20:16:45.907595 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7mqtg" Dec 01 20:16:47 crc kubenswrapper[4802]: I1201 20:16:47.444899 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.159177 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rsnxk"] Dec 01 20:16:51 crc kubenswrapper[4802]: E1201 20:16:51.159802 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c415ac94-c3a7-4e60-953f-748556482cc6" containerName="mariadb-database-create" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.159815 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c415ac94-c3a7-4e60-953f-748556482cc6" containerName="mariadb-database-create" Dec 01 20:16:51 crc kubenswrapper[4802]: E1201 20:16:51.159832 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c488df2-71d2-4e82-9ac1-224ad92f0744" containerName="mariadb-account-create-update" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.159838 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c488df2-71d2-4e82-9ac1-224ad92f0744" containerName="mariadb-account-create-update" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.159975 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c488df2-71d2-4e82-9ac1-224ad92f0744" containerName="mariadb-account-create-update" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.159984 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="c415ac94-c3a7-4e60-953f-748556482cc6" containerName="mariadb-database-create" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.160513 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.163098 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.163958 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kflk9" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.168914 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rsnxk"] Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.194806 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-config-data\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.194934 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgt5w\" (UniqueName: \"kubernetes.io/projected/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-kube-api-access-zgt5w\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.195011 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-combined-ca-bundle\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.195064 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-db-sync-config-data\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.296814 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-db-sync-config-data\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.296976 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-config-data\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.297043 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgt5w\" (UniqueName: \"kubernetes.io/projected/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-kube-api-access-zgt5w\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.297101 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-combined-ca-bundle\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.304778 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-db-sync-config-data\") pod \"glance-db-sync-rsnxk\" (UID: 
\"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.304977 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-combined-ca-bundle\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.314456 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgt5w\" (UniqueName: \"kubernetes.io/projected/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-kube-api-access-zgt5w\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.314652 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-config-data\") pod \"glance-db-sync-rsnxk\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:51 crc kubenswrapper[4802]: I1201 20:16:51.478236 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rsnxk" Dec 01 20:16:52 crc kubenswrapper[4802]: I1201 20:16:52.102605 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rsnxk"] Dec 01 20:16:52 crc kubenswrapper[4802]: W1201 20:16:52.106599 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cff0cc2_bdb2_4b4d_8b91_51a4021dcc2a.slice/crio-563ffef3101684ca1b2fa2e1a3e18c9e9a0a9947717074e5a88e38620a0b1543 WatchSource:0}: Error finding container 563ffef3101684ca1b2fa2e1a3e18c9e9a0a9947717074e5a88e38620a0b1543: Status 404 returned error can't find the container with id 563ffef3101684ca1b2fa2e1a3e18c9e9a0a9947717074e5a88e38620a0b1543 Dec 01 20:16:52 crc kubenswrapper[4802]: I1201 20:16:52.962401 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rsnxk" event={"ID":"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a","Type":"ContainerStarted","Data":"563ffef3101684ca1b2fa2e1a3e18c9e9a0a9947717074e5a88e38620a0b1543"} Dec 01 20:16:56 crc kubenswrapper[4802]: I1201 20:16:56.583349 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:16:56 crc kubenswrapper[4802]: I1201 20:16:56.618018 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j9thx" Dec 01 20:16:56 crc kubenswrapper[4802]: I1201 20:16:56.654840 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gczlr" podUID="4bbe4b6e-302e-4d6d-bc17-5f35baca1067" containerName="ovn-controller" probeResult="failure" output=< Dec 01 20:16:56 crc kubenswrapper[4802]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 01 20:16:56 crc kubenswrapper[4802]: > Dec 01 20:16:56 crc kubenswrapper[4802]: I1201 20:16:56.835723 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-gczlr-config-d2zwk"] Dec 01 20:16:56 crc kubenswrapper[4802]: I1201 20:16:56.842032 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:56 crc kubenswrapper[4802]: I1201 20:16:56.851083 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 20:16:56 crc kubenswrapper[4802]: I1201 20:16:56.884452 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gczlr-config-d2zwk"] Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.000694 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run-ovn\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.000845 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.000877 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-log-ovn\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.000909 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ztgxv\" (UniqueName: \"kubernetes.io/projected/c438fc60-ba0c-436d-906c-dd90d99a1e56-kube-api-access-ztgxv\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.000975 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-scripts\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.001008 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-additional-scripts\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.079542 4802 generic.go:334] "Generic (PLEG): container finished" podID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" containerID="957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f" exitCode=0 Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.079940 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"97e35ed2-d0e5-4f29-8869-3740e22f5cd9","Type":"ContainerDied","Data":"957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f"} Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.103217 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " 
pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.103261 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-log-ovn\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.103283 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztgxv\" (UniqueName: \"kubernetes.io/projected/c438fc60-ba0c-436d-906c-dd90d99a1e56-kube-api-access-ztgxv\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.103309 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-scripts\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.103333 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-additional-scripts\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.103379 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run-ovn\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " 
pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.103662 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run-ovn\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.103707 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.103743 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-log-ovn\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.107117 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-scripts\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.107603 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-additional-scripts\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc 
kubenswrapper[4802]: I1201 20:16:57.126619 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztgxv\" (UniqueName: \"kubernetes.io/projected/c438fc60-ba0c-436d-906c-dd90d99a1e56-kube-api-access-ztgxv\") pod \"ovn-controller-gczlr-config-d2zwk\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.188737 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:16:57 crc kubenswrapper[4802]: I1201 20:16:57.604327 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gczlr-config-d2zwk"] Dec 01 20:16:57 crc kubenswrapper[4802]: W1201 20:16:57.634603 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc438fc60_ba0c_436d_906c_dd90d99a1e56.slice/crio-a77f11f1a2673a410abd9e4e9fb94a55bb58416b2c3626d6f5076a7a7a29967d WatchSource:0}: Error finding container a77f11f1a2673a410abd9e4e9fb94a55bb58416b2c3626d6f5076a7a7a29967d: Status 404 returned error can't find the container with id a77f11f1a2673a410abd9e4e9fb94a55bb58416b2c3626d6f5076a7a7a29967d Dec 01 20:16:58 crc kubenswrapper[4802]: I1201 20:16:58.108364 4802 generic.go:334] "Generic (PLEG): container finished" podID="563ae8fc-e33c-402e-8901-79434cf68179" containerID="cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a" exitCode=0 Dec 01 20:16:58 crc kubenswrapper[4802]: I1201 20:16:58.108468 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"563ae8fc-e33c-402e-8901-79434cf68179","Type":"ContainerDied","Data":"cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a"} Dec 01 20:16:58 crc kubenswrapper[4802]: I1201 20:16:58.116355 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"97e35ed2-d0e5-4f29-8869-3740e22f5cd9","Type":"ContainerStarted","Data":"9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9"} Dec 01 20:16:58 crc kubenswrapper[4802]: I1201 20:16:58.116545 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:16:58 crc kubenswrapper[4802]: I1201 20:16:58.117724 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gczlr-config-d2zwk" event={"ID":"c438fc60-ba0c-436d-906c-dd90d99a1e56","Type":"ContainerStarted","Data":"229722e7a59e4b76f8fb2b1ae757a602669ff8478bd89105ba1637846c35b8c9"} Dec 01 20:16:58 crc kubenswrapper[4802]: I1201 20:16:58.117754 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gczlr-config-d2zwk" event={"ID":"c438fc60-ba0c-436d-906c-dd90d99a1e56","Type":"ContainerStarted","Data":"a77f11f1a2673a410abd9e4e9fb94a55bb58416b2c3626d6f5076a7a7a29967d"} Dec 01 20:16:58 crc kubenswrapper[4802]: I1201 20:16:58.176404 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.367202954 podStartE2EDuration="1m18.176382784s" podCreationTimestamp="2025-12-01 20:15:40 +0000 UTC" firstStartedPulling="2025-12-01 20:15:42.627983314 +0000 UTC m=+1164.190542945" lastFinishedPulling="2025-12-01 20:16:22.437163114 +0000 UTC m=+1203.999722775" observedRunningTime="2025-12-01 20:16:58.16226829 +0000 UTC m=+1239.724827951" watchObservedRunningTime="2025-12-01 20:16:58.176382784 +0000 UTC m=+1239.738942425" Dec 01 20:16:58 crc kubenswrapper[4802]: I1201 20:16:58.188412 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-gczlr-config-d2zwk" podStartSLOduration=2.188393172 podStartE2EDuration="2.188393172s" podCreationTimestamp="2025-12-01 20:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 20:16:58.180809202 +0000 UTC m=+1239.743368863" watchObservedRunningTime="2025-12-01 20:16:58.188393172 +0000 UTC m=+1239.750952803" Dec 01 20:16:59 crc kubenswrapper[4802]: I1201 20:16:59.149016 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"563ae8fc-e33c-402e-8901-79434cf68179","Type":"ContainerStarted","Data":"d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28"} Dec 01 20:16:59 crc kubenswrapper[4802]: I1201 20:16:59.149529 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 20:16:59 crc kubenswrapper[4802]: I1201 20:16:59.152500 4802 generic.go:334] "Generic (PLEG): container finished" podID="c438fc60-ba0c-436d-906c-dd90d99a1e56" containerID="229722e7a59e4b76f8fb2b1ae757a602669ff8478bd89105ba1637846c35b8c9" exitCode=0 Dec 01 20:16:59 crc kubenswrapper[4802]: I1201 20:16:59.152551 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gczlr-config-d2zwk" event={"ID":"c438fc60-ba0c-436d-906c-dd90d99a1e56","Type":"ContainerDied","Data":"229722e7a59e4b76f8fb2b1ae757a602669ff8478bd89105ba1637846c35b8c9"} Dec 01 20:16:59 crc kubenswrapper[4802]: I1201 20:16:59.176556 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.912225482 podStartE2EDuration="1m18.176539598s" podCreationTimestamp="2025-12-01 20:15:41 +0000 UTC" firstStartedPulling="2025-12-01 20:15:43.349828146 +0000 UTC m=+1164.912387787" lastFinishedPulling="2025-12-01 20:16:22.614142262 +0000 UTC m=+1204.176701903" observedRunningTime="2025-12-01 20:16:59.172855961 +0000 UTC m=+1240.735415612" watchObservedRunningTime="2025-12-01 20:16:59.176539598 +0000 UTC m=+1240.739099239" Dec 01 20:17:01 crc kubenswrapper[4802]: I1201 20:17:01.554904 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-gczlr" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.194877 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.249719 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run\") pod \"c438fc60-ba0c-436d-906c-dd90d99a1e56\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.249809 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run" (OuterVolumeSpecName: "var-run") pod "c438fc60-ba0c-436d-906c-dd90d99a1e56" (UID: "c438fc60-ba0c-436d-906c-dd90d99a1e56"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.249826 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-scripts\") pod \"c438fc60-ba0c-436d-906c-dd90d99a1e56\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.249985 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run-ovn\") pod \"c438fc60-ba0c-436d-906c-dd90d99a1e56\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.250051 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-additional-scripts\") pod \"c438fc60-ba0c-436d-906c-dd90d99a1e56\" (UID: 
\"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.250083 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c438fc60-ba0c-436d-906c-dd90d99a1e56" (UID: "c438fc60-ba0c-436d-906c-dd90d99a1e56"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.250959 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c438fc60-ba0c-436d-906c-dd90d99a1e56" (UID: "c438fc60-ba0c-436d-906c-dd90d99a1e56"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.251020 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztgxv\" (UniqueName: \"kubernetes.io/projected/c438fc60-ba0c-436d-906c-dd90d99a1e56-kube-api-access-ztgxv\") pod \"c438fc60-ba0c-436d-906c-dd90d99a1e56\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.251448 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-log-ovn\") pod \"c438fc60-ba0c-436d-906c-dd90d99a1e56\" (UID: \"c438fc60-ba0c-436d-906c-dd90d99a1e56\") " Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.251814 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c438fc60-ba0c-436d-906c-dd90d99a1e56" (UID: "c438fc60-ba0c-436d-906c-dd90d99a1e56"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.252148 4802 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.252170 4802 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.252183 4802 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.252247 4802 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c438fc60-ba0c-436d-906c-dd90d99a1e56-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.252478 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-scripts" (OuterVolumeSpecName: "scripts") pod "c438fc60-ba0c-436d-906c-dd90d99a1e56" (UID: "c438fc60-ba0c-436d-906c-dd90d99a1e56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.255360 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c438fc60-ba0c-436d-906c-dd90d99a1e56-kube-api-access-ztgxv" (OuterVolumeSpecName: "kube-api-access-ztgxv") pod "c438fc60-ba0c-436d-906c-dd90d99a1e56" (UID: "c438fc60-ba0c-436d-906c-dd90d99a1e56"). InnerVolumeSpecName "kube-api-access-ztgxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.325748 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gczlr-config-d2zwk" event={"ID":"c438fc60-ba0c-436d-906c-dd90d99a1e56","Type":"ContainerDied","Data":"a77f11f1a2673a410abd9e4e9fb94a55bb58416b2c3626d6f5076a7a7a29967d"} Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.325788 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a77f11f1a2673a410abd9e4e9fb94a55bb58416b2c3626d6f5076a7a7a29967d" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.325834 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gczlr-config-d2zwk" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.353513 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c438fc60-ba0c-436d-906c-dd90d99a1e56-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:09 crc kubenswrapper[4802]: I1201 20:17:09.353552 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztgxv\" (UniqueName: \"kubernetes.io/projected/c438fc60-ba0c-436d-906c-dd90d99a1e56-kube-api-access-ztgxv\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:10 crc kubenswrapper[4802]: I1201 20:17:10.318810 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gczlr-config-d2zwk"] Dec 01 20:17:10 crc kubenswrapper[4802]: I1201 20:17:10.327556 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gczlr-config-d2zwk"] Dec 01 20:17:10 crc kubenswrapper[4802]: I1201 20:17:10.336645 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rsnxk" event={"ID":"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a","Type":"ContainerStarted","Data":"5a196846fc76864e8912d6493edb4c1b3c7899dd992ec1455f2b53c61650588e"} Dec 01 20:17:10 crc 
kubenswrapper[4802]: I1201 20:17:10.356156 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rsnxk" podStartSLOduration=2.265007977 podStartE2EDuration="19.356138016s" podCreationTimestamp="2025-12-01 20:16:51 +0000 UTC" firstStartedPulling="2025-12-01 20:16:52.108968669 +0000 UTC m=+1233.671528330" lastFinishedPulling="2025-12-01 20:17:09.200098708 +0000 UTC m=+1250.762658369" observedRunningTime="2025-12-01 20:17:10.349403535 +0000 UTC m=+1251.911963176" watchObservedRunningTime="2025-12-01 20:17:10.356138016 +0000 UTC m=+1251.918697657" Dec 01 20:17:10 crc kubenswrapper[4802]: I1201 20:17:10.732363 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c438fc60-ba0c-436d-906c-dd90d99a1e56" path="/var/lib/kubelet/pods/c438fc60-ba0c-436d-906c-dd90d99a1e56/volumes" Dec 01 20:17:12 crc kubenswrapper[4802]: I1201 20:17:12.140514 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:17:12 crc kubenswrapper[4802]: I1201 20:17:12.487840 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.551174 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-mhtn2"] Dec 01 20:17:14 crc kubenswrapper[4802]: E1201 20:17:14.551861 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c438fc60-ba0c-436d-906c-dd90d99a1e56" containerName="ovn-config" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.551875 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c438fc60-ba0c-436d-906c-dd90d99a1e56" containerName="ovn-config" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.552051 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="c438fc60-ba0c-436d-906c-dd90d99a1e56" containerName="ovn-config" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.552603 
4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.560430 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mhtn2"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.637643 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-5c267"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.639014 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5c267" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.653259 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5c267"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.668572 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-26c0-account-create-update-pmhgj"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.670033 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.671673 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.700749 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-26c0-account-create-update-pmhgj"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.711629 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-operator-scripts\") pod \"cinder-db-create-mhtn2\" (UID: \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\") " pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.711773 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6g79\" (UniqueName: \"kubernetes.io/projected/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-kube-api-access-s6g79\") pod \"cinder-db-create-mhtn2\" (UID: \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\") " pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.747758 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-371f-account-create-update-8gtw9"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.748860 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.756964 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.760586 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-371f-account-create-update-8gtw9"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.809150 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-v6674"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.810121 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.812317 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.812507 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.812715 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jd99d" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.812973 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-operator-scripts\") pod \"cinder-db-create-mhtn2\" (UID: \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\") " pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.813068 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80050873-f4d6-4468-ba9c-7f5090932b88-operator-scripts\") pod \"cinder-26c0-account-create-update-pmhgj\" (UID: 
\"80050873-f4d6-4468-ba9c-7f5090932b88\") " pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.813099 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5xhg\" (UniqueName: \"kubernetes.io/projected/80050873-f4d6-4468-ba9c-7f5090932b88-kube-api-access-s5xhg\") pod \"cinder-26c0-account-create-update-pmhgj\" (UID: \"80050873-f4d6-4468-ba9c-7f5090932b88\") " pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.813131 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qg65\" (UniqueName: \"kubernetes.io/projected/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-kube-api-access-6qg65\") pod \"barbican-db-create-5c267\" (UID: \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\") " pod="openstack/barbican-db-create-5c267" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.813223 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.813263 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6g79\" (UniqueName: \"kubernetes.io/projected/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-kube-api-access-s6g79\") pod \"cinder-db-create-mhtn2\" (UID: \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\") " pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.813339 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-operator-scripts\") pod \"barbican-db-create-5c267\" (UID: \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\") " pod="openstack/barbican-db-create-5c267" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.814064 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-operator-scripts\") pod \"cinder-db-create-mhtn2\" (UID: \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\") " pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.822218 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v6674"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.833736 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6g79\" (UniqueName: \"kubernetes.io/projected/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-kube-api-access-s6g79\") pod \"cinder-db-create-mhtn2\" (UID: \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\") " pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.872989 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.915504 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-operator-scripts\") pod \"barbican-db-create-5c267\" (UID: \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\") " pod="openstack/barbican-db-create-5c267" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.915808 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2dsv\" (UniqueName: \"kubernetes.io/projected/4551c7b2-d51a-45d2-a645-58bd0da669c5-kube-api-access-j2dsv\") pod \"barbican-371f-account-create-update-8gtw9\" (UID: \"4551c7b2-d51a-45d2-a645-58bd0da669c5\") " pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.915845 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-config-data\") pod \"keystone-db-sync-v6674\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.915939 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-combined-ca-bundle\") pod \"keystone-db-sync-v6674\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.916035 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4551c7b2-d51a-45d2-a645-58bd0da669c5-operator-scripts\") pod \"barbican-371f-account-create-update-8gtw9\" (UID: \"4551c7b2-d51a-45d2-a645-58bd0da669c5\") " pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.916074 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80050873-f4d6-4468-ba9c-7f5090932b88-operator-scripts\") pod \"cinder-26c0-account-create-update-pmhgj\" (UID: \"80050873-f4d6-4468-ba9c-7f5090932b88\") " pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.916119 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5xhg\" (UniqueName: \"kubernetes.io/projected/80050873-f4d6-4468-ba9c-7f5090932b88-kube-api-access-s5xhg\") pod \"cinder-26c0-account-create-update-pmhgj\" (UID: \"80050873-f4d6-4468-ba9c-7f5090932b88\") " pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:14 crc 
kubenswrapper[4802]: I1201 20:17:14.916396 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-operator-scripts\") pod \"barbican-db-create-5c267\" (UID: \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\") " pod="openstack/barbican-db-create-5c267" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.916141 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qg65\" (UniqueName: \"kubernetes.io/projected/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-kube-api-access-6qg65\") pod \"barbican-db-create-5c267\" (UID: \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\") " pod="openstack/barbican-db-create-5c267" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.916490 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw2dx\" (UniqueName: \"kubernetes.io/projected/8852056c-2330-4f2b-acd3-89442f05e8c9-kube-api-access-fw2dx\") pod \"keystone-db-sync-v6674\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.917371 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80050873-f4d6-4468-ba9c-7f5090932b88-operator-scripts\") pod \"cinder-26c0-account-create-update-pmhgj\" (UID: \"80050873-f4d6-4468-ba9c-7f5090932b88\") " pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.947187 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hcph5"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.948760 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qg65\" (UniqueName: \"kubernetes.io/projected/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-kube-api-access-6qg65\") pod 
\"barbican-db-create-5c267\" (UID: \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\") " pod="openstack/barbican-db-create-5c267" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.949967 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.954657 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5c267" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.955668 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5xhg\" (UniqueName: \"kubernetes.io/projected/80050873-f4d6-4468-ba9c-7f5090932b88-kube-api-access-s5xhg\") pod \"cinder-26c0-account-create-update-pmhgj\" (UID: \"80050873-f4d6-4468-ba9c-7f5090932b88\") " pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.978179 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hcph5"] Dec 01 20:17:14 crc kubenswrapper[4802]: I1201 20:17:14.985033 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.024658 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2dsv\" (UniqueName: \"kubernetes.io/projected/4551c7b2-d51a-45d2-a645-58bd0da669c5-kube-api-access-j2dsv\") pod \"barbican-371f-account-create-update-8gtw9\" (UID: \"4551c7b2-d51a-45d2-a645-58bd0da669c5\") " pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.024733 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-config-data\") pod \"keystone-db-sync-v6674\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.024801 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-combined-ca-bundle\") pod \"keystone-db-sync-v6674\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.024838 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4551c7b2-d51a-45d2-a645-58bd0da669c5-operator-scripts\") pod \"barbican-371f-account-create-update-8gtw9\" (UID: \"4551c7b2-d51a-45d2-a645-58bd0da669c5\") " pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.024931 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw2dx\" (UniqueName: \"kubernetes.io/projected/8852056c-2330-4f2b-acd3-89442f05e8c9-kube-api-access-fw2dx\") pod \"keystone-db-sync-v6674\" (UID: 
\"8852056c-2330-4f2b-acd3-89442f05e8c9\") " pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.027094 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4551c7b2-d51a-45d2-a645-58bd0da669c5-operator-scripts\") pod \"barbican-371f-account-create-update-8gtw9\" (UID: \"4551c7b2-d51a-45d2-a645-58bd0da669c5\") " pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.037829 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-config-data\") pod \"keystone-db-sync-v6674\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.044749 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-combined-ca-bundle\") pod \"keystone-db-sync-v6674\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.046714 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2dsv\" (UniqueName: \"kubernetes.io/projected/4551c7b2-d51a-45d2-a645-58bd0da669c5-kube-api-access-j2dsv\") pod \"barbican-371f-account-create-update-8gtw9\" (UID: \"4551c7b2-d51a-45d2-a645-58bd0da669c5\") " pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.047920 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw2dx\" (UniqueName: \"kubernetes.io/projected/8852056c-2330-4f2b-acd3-89442f05e8c9-kube-api-access-fw2dx\") pod \"keystone-db-sync-v6674\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " 
pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.086285 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4560-account-create-update-256g2"] Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.089582 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.092082 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.094547 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.118132 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4560-account-create-update-256g2"] Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.127463 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.128578 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djmpw\" (UniqueName: \"kubernetes.io/projected/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-kube-api-access-djmpw\") pod \"neutron-db-create-hcph5\" (UID: \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\") " pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.128680 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-operator-scripts\") pod \"neutron-db-create-hcph5\" (UID: \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\") " pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.230377 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqrj\" (UniqueName: \"kubernetes.io/projected/3f7b6d98-cecf-4243-b6b9-b217eb229549-kube-api-access-9fqrj\") pod \"neutron-4560-account-create-update-256g2\" (UID: \"3f7b6d98-cecf-4243-b6b9-b217eb229549\") " pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.230431 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-operator-scripts\") pod \"neutron-db-create-hcph5\" (UID: \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\") " pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.230510 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djmpw\" (UniqueName: \"kubernetes.io/projected/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-kube-api-access-djmpw\") pod 
\"neutron-db-create-hcph5\" (UID: \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\") " pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.230550 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b6d98-cecf-4243-b6b9-b217eb229549-operator-scripts\") pod \"neutron-4560-account-create-update-256g2\" (UID: \"3f7b6d98-cecf-4243-b6b9-b217eb229549\") " pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.231788 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-mhtn2"] Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.233034 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-operator-scripts\") pod \"neutron-db-create-hcph5\" (UID: \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\") " pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.259865 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djmpw\" (UniqueName: \"kubernetes.io/projected/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-kube-api-access-djmpw\") pod \"neutron-db-create-hcph5\" (UID: \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\") " pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.332033 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqrj\" (UniqueName: \"kubernetes.io/projected/3f7b6d98-cecf-4243-b6b9-b217eb229549-kube-api-access-9fqrj\") pod \"neutron-4560-account-create-update-256g2\" (UID: \"3f7b6d98-cecf-4243-b6b9-b217eb229549\") " pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.332168 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b6d98-cecf-4243-b6b9-b217eb229549-operator-scripts\") pod \"neutron-4560-account-create-update-256g2\" (UID: \"3f7b6d98-cecf-4243-b6b9-b217eb229549\") " pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.336689 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b6d98-cecf-4243-b6b9-b217eb229549-operator-scripts\") pod \"neutron-4560-account-create-update-256g2\" (UID: \"3f7b6d98-cecf-4243-b6b9-b217eb229549\") " pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.354483 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqrj\" (UniqueName: \"kubernetes.io/projected/3f7b6d98-cecf-4243-b6b9-b217eb229549-kube-api-access-9fqrj\") pod \"neutron-4560-account-create-update-256g2\" (UID: \"3f7b6d98-cecf-4243-b6b9-b217eb229549\") " pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.404999 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mhtn2" event={"ID":"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd","Type":"ContainerStarted","Data":"87b6ffd6ce1cd6a272489390cd822c1d4deb2ba8f8db0a4ae79ad1d3b9f3b24f"} Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.419738 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.428132 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.636548 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-26c0-account-create-update-pmhgj"] Dec 01 20:17:15 crc kubenswrapper[4802]: W1201 20:17:15.639143 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80050873_f4d6_4468_ba9c_7f5090932b88.slice/crio-dc58653082051ba2f6e2a00741115c67d8fdf1d6843164bf733295b51920014d WatchSource:0}: Error finding container dc58653082051ba2f6e2a00741115c67d8fdf1d6843164bf733295b51920014d: Status 404 returned error can't find the container with id dc58653082051ba2f6e2a00741115c67d8fdf1d6843164bf733295b51920014d Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.684407 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5c267"] Dec 01 20:17:15 crc kubenswrapper[4802]: W1201 20:17:15.689328 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30c7ca3c_3346_442b_b9c8_54b8beb87e8e.slice/crio-9a6d9602316bb603dda48420179e2d0f52cedc8074710e6ff3ab94fc54e5085b WatchSource:0}: Error finding container 9a6d9602316bb603dda48420179e2d0f52cedc8074710e6ff3ab94fc54e5085b: Status 404 returned error can't find the container with id 9a6d9602316bb603dda48420179e2d0f52cedc8074710e6ff3ab94fc54e5085b Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.731329 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-371f-account-create-update-8gtw9"] Dec 01 20:17:15 crc kubenswrapper[4802]: W1201 20:17:15.740311 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4551c7b2_d51a_45d2_a645_58bd0da669c5.slice/crio-6ed9bcf991cd99da65e1ea8bd6c67db48430ee5397a87069e60386e7ffce419f 
WatchSource:0}: Error finding container 6ed9bcf991cd99da65e1ea8bd6c67db48430ee5397a87069e60386e7ffce419f: Status 404 returned error can't find the container with id 6ed9bcf991cd99da65e1ea8bd6c67db48430ee5397a87069e60386e7ffce419f Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.798950 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hcph5"] Dec 01 20:17:15 crc kubenswrapper[4802]: W1201 20:17:15.813497 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b5c1ebd_d03b_4914_a8fd_498a3cd06581.slice/crio-cf67325d7be662a1031c2f11aa252690f7eaeb501c514a08b77ad9927b1b1f2b WatchSource:0}: Error finding container cf67325d7be662a1031c2f11aa252690f7eaeb501c514a08b77ad9927b1b1f2b: Status 404 returned error can't find the container with id cf67325d7be662a1031c2f11aa252690f7eaeb501c514a08b77ad9927b1b1f2b Dec 01 20:17:15 crc kubenswrapper[4802]: I1201 20:17:15.817547 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v6674"] Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.074458 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4560-account-create-update-256g2"] Dec 01 20:17:16 crc kubenswrapper[4802]: W1201 20:17:16.103091 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f7b6d98_cecf_4243_b6b9_b217eb229549.slice/crio-19985ffd0e079df025988450263d4b5bfeb2eeae4de53a0f674af60a83a73fee WatchSource:0}: Error finding container 19985ffd0e079df025988450263d4b5bfeb2eeae4de53a0f674af60a83a73fee: Status 404 returned error can't find the container with id 19985ffd0e079df025988450263d4b5bfeb2eeae4de53a0f674af60a83a73fee Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.414464 4802 generic.go:334] "Generic (PLEG): container finished" podID="80050873-f4d6-4468-ba9c-7f5090932b88" 
containerID="9dcce808985a3fd15a2baeaa92a62b9648efe6baecafa319ea7e7138de935a7c" exitCode=0 Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.414508 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-26c0-account-create-update-pmhgj" event={"ID":"80050873-f4d6-4468-ba9c-7f5090932b88","Type":"ContainerDied","Data":"9dcce808985a3fd15a2baeaa92a62b9648efe6baecafa319ea7e7138de935a7c"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.414740 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-26c0-account-create-update-pmhgj" event={"ID":"80050873-f4d6-4468-ba9c-7f5090932b88","Type":"ContainerStarted","Data":"dc58653082051ba2f6e2a00741115c67d8fdf1d6843164bf733295b51920014d"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.416826 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-371f-account-create-update-8gtw9" event={"ID":"4551c7b2-d51a-45d2-a645-58bd0da669c5","Type":"ContainerStarted","Data":"c6ec4adbd9890cafa668a95709aa21fd99a23e7486e2fdf2b6dd85856a007083"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.416867 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-371f-account-create-update-8gtw9" event={"ID":"4551c7b2-d51a-45d2-a645-58bd0da669c5","Type":"ContainerStarted","Data":"6ed9bcf991cd99da65e1ea8bd6c67db48430ee5397a87069e60386e7ffce419f"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.418421 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6674" event={"ID":"8852056c-2330-4f2b-acd3-89442f05e8c9","Type":"ContainerStarted","Data":"a2196d0982079ca76fa8ae17b84c9c4fb9c4d1014d4bb376d747098ee66259e0"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.420325 4802 generic.go:334] "Generic (PLEG): container finished" podID="0b5c1ebd-d03b-4914-a8fd-498a3cd06581" containerID="78750f258f78bf0823a33af908db5c118b068bf4427190a3f0d55001944ebb63" exitCode=0 Dec 01 20:17:16 crc 
kubenswrapper[4802]: I1201 20:17:16.420368 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hcph5" event={"ID":"0b5c1ebd-d03b-4914-a8fd-498a3cd06581","Type":"ContainerDied","Data":"78750f258f78bf0823a33af908db5c118b068bf4427190a3f0d55001944ebb63"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.420422 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hcph5" event={"ID":"0b5c1ebd-d03b-4914-a8fd-498a3cd06581","Type":"ContainerStarted","Data":"cf67325d7be662a1031c2f11aa252690f7eaeb501c514a08b77ad9927b1b1f2b"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.422640 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4560-account-create-update-256g2" event={"ID":"3f7b6d98-cecf-4243-b6b9-b217eb229549","Type":"ContainerStarted","Data":"6340a4eb621303ea2058895ebb55c94ffff1788ceb040708bd567bc75f12f3fb"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.422682 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4560-account-create-update-256g2" event={"ID":"3f7b6d98-cecf-4243-b6b9-b217eb229549","Type":"ContainerStarted","Data":"19985ffd0e079df025988450263d4b5bfeb2eeae4de53a0f674af60a83a73fee"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.424355 4802 generic.go:334] "Generic (PLEG): container finished" podID="30c7ca3c-3346-442b-b9c8-54b8beb87e8e" containerID="443cb30caf2603f77867072fbf034809cacd5027ad575f34ed8d81c3d32b05d7" exitCode=0 Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.424432 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5c267" event={"ID":"30c7ca3c-3346-442b-b9c8-54b8beb87e8e","Type":"ContainerDied","Data":"443cb30caf2603f77867072fbf034809cacd5027ad575f34ed8d81c3d32b05d7"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.424471 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5c267" 
event={"ID":"30c7ca3c-3346-442b-b9c8-54b8beb87e8e","Type":"ContainerStarted","Data":"9a6d9602316bb603dda48420179e2d0f52cedc8074710e6ff3ab94fc54e5085b"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.426264 4802 generic.go:334] "Generic (PLEG): container finished" podID="0448f2b2-4e07-4fa4-85ea-0c37f6664dcd" containerID="199ca1f92b4971d21f75865bd3b37a30f91dd77268f1281d6980f1e1f07e2410" exitCode=0 Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.426308 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mhtn2" event={"ID":"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd","Type":"ContainerDied","Data":"199ca1f92b4971d21f75865bd3b37a30f91dd77268f1281d6980f1e1f07e2410"} Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.496587 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-371f-account-create-update-8gtw9" podStartSLOduration=2.496559317 podStartE2EDuration="2.496559317s" podCreationTimestamp="2025-12-01 20:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:17:16.484049974 +0000 UTC m=+1258.046609615" watchObservedRunningTime="2025-12-01 20:17:16.496559317 +0000 UTC m=+1258.059118978" Dec 01 20:17:16 crc kubenswrapper[4802]: I1201 20:17:16.526447 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-4560-account-create-update-256g2" podStartSLOduration=1.526425337 podStartE2EDuration="1.526425337s" podCreationTimestamp="2025-12-01 20:17:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:17:16.522115072 +0000 UTC m=+1258.084674743" watchObservedRunningTime="2025-12-01 20:17:16.526425337 +0000 UTC m=+1258.088984978" Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.441598 4802 generic.go:334] "Generic (PLEG): container finished" 
podID="4551c7b2-d51a-45d2-a645-58bd0da669c5" containerID="c6ec4adbd9890cafa668a95709aa21fd99a23e7486e2fdf2b6dd85856a007083" exitCode=0 Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.441639 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-371f-account-create-update-8gtw9" event={"ID":"4551c7b2-d51a-45d2-a645-58bd0da669c5","Type":"ContainerDied","Data":"c6ec4adbd9890cafa668a95709aa21fd99a23e7486e2fdf2b6dd85856a007083"} Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.443784 4802 generic.go:334] "Generic (PLEG): container finished" podID="3f7b6d98-cecf-4243-b6b9-b217eb229549" containerID="6340a4eb621303ea2058895ebb55c94ffff1788ceb040708bd567bc75f12f3fb" exitCode=0 Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.443834 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4560-account-create-update-256g2" event={"ID":"3f7b6d98-cecf-4243-b6b9-b217eb229549","Type":"ContainerDied","Data":"6340a4eb621303ea2058895ebb55c94ffff1788ceb040708bd567bc75f12f3fb"} Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.846692 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.979344 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6g79\" (UniqueName: \"kubernetes.io/projected/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-kube-api-access-s6g79\") pod \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\" (UID: \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\") " Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.981670 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0448f2b2-4e07-4fa4-85ea-0c37f6664dcd" (UID: "0448f2b2-4e07-4fa4-85ea-0c37f6664dcd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.979701 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-operator-scripts\") pod \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\" (UID: \"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd\") " Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.982687 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:17 crc kubenswrapper[4802]: I1201 20:17:17.987012 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-kube-api-access-s6g79" (OuterVolumeSpecName: "kube-api-access-s6g79") pod "0448f2b2-4e07-4fa4-85ea-0c37f6664dcd" (UID: "0448f2b2-4e07-4fa4-85ea-0c37f6664dcd"). InnerVolumeSpecName "kube-api-access-s6g79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.068156 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5c267" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.079239 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.084282 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.088750 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6g79\" (UniqueName: \"kubernetes.io/projected/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd-kube-api-access-s6g79\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.189825 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-operator-scripts\") pod \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\" (UID: \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\") " Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.189885 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80050873-f4d6-4468-ba9c-7f5090932b88-operator-scripts\") pod \"80050873-f4d6-4468-ba9c-7f5090932b88\" (UID: \"80050873-f4d6-4468-ba9c-7f5090932b88\") " Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.189932 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-operator-scripts\") pod \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\" (UID: \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\") " Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.189982 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5xhg\" (UniqueName: \"kubernetes.io/projected/80050873-f4d6-4468-ba9c-7f5090932b88-kube-api-access-s5xhg\") pod \"80050873-f4d6-4468-ba9c-7f5090932b88\" (UID: \"80050873-f4d6-4468-ba9c-7f5090932b88\") " Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.190042 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djmpw\" (UniqueName: 
\"kubernetes.io/projected/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-kube-api-access-djmpw\") pod \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\" (UID: \"0b5c1ebd-d03b-4914-a8fd-498a3cd06581\") " Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.190096 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qg65\" (UniqueName: \"kubernetes.io/projected/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-kube-api-access-6qg65\") pod \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\" (UID: \"30c7ca3c-3346-442b-b9c8-54b8beb87e8e\") " Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.191214 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b5c1ebd-d03b-4914-a8fd-498a3cd06581" (UID: "0b5c1ebd-d03b-4914-a8fd-498a3cd06581"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.191559 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30c7ca3c-3346-442b-b9c8-54b8beb87e8e" (UID: "30c7ca3c-3346-442b-b9c8-54b8beb87e8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.191913 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80050873-f4d6-4468-ba9c-7f5090932b88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80050873-f4d6-4468-ba9c-7f5090932b88" (UID: "80050873-f4d6-4468-ba9c-7f5090932b88"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.193584 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-kube-api-access-6qg65" (OuterVolumeSpecName: "kube-api-access-6qg65") pod "30c7ca3c-3346-442b-b9c8-54b8beb87e8e" (UID: "30c7ca3c-3346-442b-b9c8-54b8beb87e8e"). InnerVolumeSpecName "kube-api-access-6qg65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.194813 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80050873-f4d6-4468-ba9c-7f5090932b88-kube-api-access-s5xhg" (OuterVolumeSpecName: "kube-api-access-s5xhg") pod "80050873-f4d6-4468-ba9c-7f5090932b88" (UID: "80050873-f4d6-4468-ba9c-7f5090932b88"). InnerVolumeSpecName "kube-api-access-s5xhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.195183 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-kube-api-access-djmpw" (OuterVolumeSpecName: "kube-api-access-djmpw") pod "0b5c1ebd-d03b-4914-a8fd-498a3cd06581" (UID: "0b5c1ebd-d03b-4914-a8fd-498a3cd06581"). InnerVolumeSpecName "kube-api-access-djmpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.291811 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5xhg\" (UniqueName: \"kubernetes.io/projected/80050873-f4d6-4468-ba9c-7f5090932b88-kube-api-access-s5xhg\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.291844 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djmpw\" (UniqueName: \"kubernetes.io/projected/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-kube-api-access-djmpw\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.291854 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qg65\" (UniqueName: \"kubernetes.io/projected/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-kube-api-access-6qg65\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.291865 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c7ca3c-3346-442b-b9c8-54b8beb87e8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.291874 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80050873-f4d6-4468-ba9c-7f5090932b88-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.291883 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b5c1ebd-d03b-4914-a8fd-498a3cd06581-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.452900 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hcph5" 
event={"ID":"0b5c1ebd-d03b-4914-a8fd-498a3cd06581","Type":"ContainerDied","Data":"cf67325d7be662a1031c2f11aa252690f7eaeb501c514a08b77ad9927b1b1f2b"} Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.452942 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf67325d7be662a1031c2f11aa252690f7eaeb501c514a08b77ad9927b1b1f2b" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.452999 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hcph5" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.459723 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5c267" event={"ID":"30c7ca3c-3346-442b-b9c8-54b8beb87e8e","Type":"ContainerDied","Data":"9a6d9602316bb603dda48420179e2d0f52cedc8074710e6ff3ab94fc54e5085b"} Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.459785 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a6d9602316bb603dda48420179e2d0f52cedc8074710e6ff3ab94fc54e5085b" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.459734 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5c267" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.461289 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-mhtn2" event={"ID":"0448f2b2-4e07-4fa4-85ea-0c37f6664dcd","Type":"ContainerDied","Data":"87b6ffd6ce1cd6a272489390cd822c1d4deb2ba8f8db0a4ae79ad1d3b9f3b24f"} Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.461321 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87b6ffd6ce1cd6a272489390cd822c1d4deb2ba8f8db0a4ae79ad1d3b9f3b24f" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.461389 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-mhtn2" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.463743 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-26c0-account-create-update-pmhgj" Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.466415 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-26c0-account-create-update-pmhgj" event={"ID":"80050873-f4d6-4468-ba9c-7f5090932b88","Type":"ContainerDied","Data":"dc58653082051ba2f6e2a00741115c67d8fdf1d6843164bf733295b51920014d"} Dec 01 20:17:18 crc kubenswrapper[4802]: I1201 20:17:18.466466 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc58653082051ba2f6e2a00741115c67d8fdf1d6843164bf733295b51920014d" Dec 01 20:17:19 crc kubenswrapper[4802]: I1201 20:17:19.474187 4802 generic.go:334] "Generic (PLEG): container finished" podID="7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a" containerID="5a196846fc76864e8912d6493edb4c1b3c7899dd992ec1455f2b53c61650588e" exitCode=0 Dec 01 20:17:19 crc kubenswrapper[4802]: I1201 20:17:19.474251 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rsnxk" event={"ID":"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a","Type":"ContainerDied","Data":"5a196846fc76864e8912d6493edb4c1b3c7899dd992ec1455f2b53c61650588e"} Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.498736 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4560-account-create-update-256g2" event={"ID":"3f7b6d98-cecf-4243-b6b9-b217eb229549","Type":"ContainerDied","Data":"19985ffd0e079df025988450263d4b5bfeb2eeae4de53a0f674af60a83a73fee"} Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.499050 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19985ffd0e079df025988450263d4b5bfeb2eeae4de53a0f674af60a83a73fee" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.500645 4802 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rsnxk" event={"ID":"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a","Type":"ContainerDied","Data":"563ffef3101684ca1b2fa2e1a3e18c9e9a0a9947717074e5a88e38620a0b1543"} Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.500690 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="563ffef3101684ca1b2fa2e1a3e18c9e9a0a9947717074e5a88e38620a0b1543" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.502788 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-371f-account-create-update-8gtw9" event={"ID":"4551c7b2-d51a-45d2-a645-58bd0da669c5","Type":"ContainerDied","Data":"6ed9bcf991cd99da65e1ea8bd6c67db48430ee5397a87069e60386e7ffce419f"} Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.502817 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ed9bcf991cd99da65e1ea8bd6c67db48430ee5397a87069e60386e7ffce419f" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.529700 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.557134 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.558357 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rsnxk" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.670165 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgt5w\" (UniqueName: \"kubernetes.io/projected/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-kube-api-access-zgt5w\") pod \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.670248 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fqrj\" (UniqueName: \"kubernetes.io/projected/3f7b6d98-cecf-4243-b6b9-b217eb229549-kube-api-access-9fqrj\") pod \"3f7b6d98-cecf-4243-b6b9-b217eb229549\" (UID: \"3f7b6d98-cecf-4243-b6b9-b217eb229549\") " Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.670271 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-combined-ca-bundle\") pod \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.670303 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b6d98-cecf-4243-b6b9-b217eb229549-operator-scripts\") pod \"3f7b6d98-cecf-4243-b6b9-b217eb229549\" (UID: \"3f7b6d98-cecf-4243-b6b9-b217eb229549\") " Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.670430 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-config-data\") pod \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.670496 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-db-sync-config-data\") pod \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\" (UID: \"7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a\") " Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.670548 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2dsv\" (UniqueName: \"kubernetes.io/projected/4551c7b2-d51a-45d2-a645-58bd0da669c5-kube-api-access-j2dsv\") pod \"4551c7b2-d51a-45d2-a645-58bd0da669c5\" (UID: \"4551c7b2-d51a-45d2-a645-58bd0da669c5\") " Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.670584 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4551c7b2-d51a-45d2-a645-58bd0da669c5-operator-scripts\") pod \"4551c7b2-d51a-45d2-a645-58bd0da669c5\" (UID: \"4551c7b2-d51a-45d2-a645-58bd0da669c5\") " Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.671376 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7b6d98-cecf-4243-b6b9-b217eb229549-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f7b6d98-cecf-4243-b6b9-b217eb229549" (UID: "3f7b6d98-cecf-4243-b6b9-b217eb229549"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.671487 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4551c7b2-d51a-45d2-a645-58bd0da669c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4551c7b2-d51a-45d2-a645-58bd0da669c5" (UID: "4551c7b2-d51a-45d2-a645-58bd0da669c5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.671673 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4551c7b2-d51a-45d2-a645-58bd0da669c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.671690 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7b6d98-cecf-4243-b6b9-b217eb229549-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.675008 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4551c7b2-d51a-45d2-a645-58bd0da669c5-kube-api-access-j2dsv" (OuterVolumeSpecName: "kube-api-access-j2dsv") pod "4551c7b2-d51a-45d2-a645-58bd0da669c5" (UID: "4551c7b2-d51a-45d2-a645-58bd0da669c5"). InnerVolumeSpecName "kube-api-access-j2dsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.675069 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-kube-api-access-zgt5w" (OuterVolumeSpecName: "kube-api-access-zgt5w") pod "7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a" (UID: "7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a"). InnerVolumeSpecName "kube-api-access-zgt5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.675507 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7b6d98-cecf-4243-b6b9-b217eb229549-kube-api-access-9fqrj" (OuterVolumeSpecName: "kube-api-access-9fqrj") pod "3f7b6d98-cecf-4243-b6b9-b217eb229549" (UID: "3f7b6d98-cecf-4243-b6b9-b217eb229549"). InnerVolumeSpecName "kube-api-access-9fqrj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.677558 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a" (UID: "7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.690658 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a" (UID: "7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.711525 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-config-data" (OuterVolumeSpecName: "config-data") pod "7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a" (UID: "7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.773813 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.773842 4802 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.773854 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2dsv\" (UniqueName: \"kubernetes.io/projected/4551c7b2-d51a-45d2-a645-58bd0da669c5-kube-api-access-j2dsv\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.773879 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgt5w\" (UniqueName: \"kubernetes.io/projected/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-kube-api-access-zgt5w\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.773888 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fqrj\" (UniqueName: \"kubernetes.io/projected/3f7b6d98-cecf-4243-b6b9-b217eb229549-kube-api-access-9fqrj\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:21 crc kubenswrapper[4802]: I1201 20:17:21.773896 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.511097 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6674" 
event={"ID":"8852056c-2330-4f2b-acd3-89442f05e8c9","Type":"ContainerStarted","Data":"a4ad1aef6c49ef0c311a9993a4f30b0878106ef7dd57c60eb5a085e2caf3edba"} Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.511152 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4560-account-create-update-256g2" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.511121 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rsnxk" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.511191 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-371f-account-create-update-8gtw9" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.539867 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-v6674" podStartSLOduration=2.938556262 podStartE2EDuration="8.539853263s" podCreationTimestamp="2025-12-01 20:17:14 +0000 UTC" firstStartedPulling="2025-12-01 20:17:15.808839023 +0000 UTC m=+1257.371398664" lastFinishedPulling="2025-12-01 20:17:21.410136024 +0000 UTC m=+1262.972695665" observedRunningTime="2025-12-01 20:17:22.538363287 +0000 UTC m=+1264.100922918" watchObservedRunningTime="2025-12-01 20:17:22.539853263 +0000 UTC m=+1264.102412904" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.916627 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-j4dg5"] Dec 01 20:17:22 crc kubenswrapper[4802]: E1201 20:17:22.917348 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5c1ebd-d03b-4914-a8fd-498a3cd06581" containerName="mariadb-database-create" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917369 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5c1ebd-d03b-4914-a8fd-498a3cd06581" containerName="mariadb-database-create" Dec 01 20:17:22 crc kubenswrapper[4802]: E1201 20:17:22.917383 
4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0448f2b2-4e07-4fa4-85ea-0c37f6664dcd" containerName="mariadb-database-create" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917393 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="0448f2b2-4e07-4fa4-85ea-0c37f6664dcd" containerName="mariadb-database-create" Dec 01 20:17:22 crc kubenswrapper[4802]: E1201 20:17:22.917409 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4551c7b2-d51a-45d2-a645-58bd0da669c5" containerName="mariadb-account-create-update" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917417 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4551c7b2-d51a-45d2-a645-58bd0da669c5" containerName="mariadb-account-create-update" Dec 01 20:17:22 crc kubenswrapper[4802]: E1201 20:17:22.917432 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c7ca3c-3346-442b-b9c8-54b8beb87e8e" containerName="mariadb-database-create" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917439 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c7ca3c-3346-442b-b9c8-54b8beb87e8e" containerName="mariadb-database-create" Dec 01 20:17:22 crc kubenswrapper[4802]: E1201 20:17:22.917467 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80050873-f4d6-4468-ba9c-7f5090932b88" containerName="mariadb-account-create-update" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917474 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="80050873-f4d6-4468-ba9c-7f5090932b88" containerName="mariadb-account-create-update" Dec 01 20:17:22 crc kubenswrapper[4802]: E1201 20:17:22.917487 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7b6d98-cecf-4243-b6b9-b217eb229549" containerName="mariadb-account-create-update" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917494 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7b6d98-cecf-4243-b6b9-b217eb229549" 
containerName="mariadb-account-create-update" Dec 01 20:17:22 crc kubenswrapper[4802]: E1201 20:17:22.917505 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a" containerName="glance-db-sync" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917513 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a" containerName="glance-db-sync" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917718 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="4551c7b2-d51a-45d2-a645-58bd0da669c5" containerName="mariadb-account-create-update" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917738 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7b6d98-cecf-4243-b6b9-b217eb229549" containerName="mariadb-account-create-update" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917760 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="80050873-f4d6-4468-ba9c-7f5090932b88" containerName="mariadb-account-create-update" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917775 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b5c1ebd-d03b-4914-a8fd-498a3cd06581" containerName="mariadb-database-create" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917801 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c7ca3c-3346-442b-b9c8-54b8beb87e8e" containerName="mariadb-database-create" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917824 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="0448f2b2-4e07-4fa4-85ea-0c37f6664dcd" containerName="mariadb-database-create" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.917847 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a" containerName="glance-db-sync" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.918926 4802 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.957250 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-j4dg5"] Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.991667 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-config\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.991730 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.991762 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsd4x\" (UniqueName: \"kubernetes.io/projected/a30a5486-cc7c-44be-bad6-5491a8003956-kube-api-access-tsd4x\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.992039 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:22 crc kubenswrapper[4802]: I1201 20:17:22.992234 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.093760 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.093821 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.093868 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-config\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.093888 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.093908 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tsd4x\" (UniqueName: \"kubernetes.io/projected/a30a5486-cc7c-44be-bad6-5491a8003956-kube-api-access-tsd4x\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.094721 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.094737 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-config\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.094943 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.095351 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.114077 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsd4x\" (UniqueName: 
\"kubernetes.io/projected/a30a5486-cc7c-44be-bad6-5491a8003956-kube-api-access-tsd4x\") pod \"dnsmasq-dns-54f9b7b8d9-j4dg5\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.253816 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:23 crc kubenswrapper[4802]: I1201 20:17:23.798689 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-j4dg5"] Dec 01 20:17:24 crc kubenswrapper[4802]: I1201 20:17:24.539788 4802 generic.go:334] "Generic (PLEG): container finished" podID="a30a5486-cc7c-44be-bad6-5491a8003956" containerID="2a7331c3d03d6429a7a416dd9b872f5bdf6549452f33926d0bb69d137cabb1bc" exitCode=0 Dec 01 20:17:24 crc kubenswrapper[4802]: I1201 20:17:24.539829 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" event={"ID":"a30a5486-cc7c-44be-bad6-5491a8003956","Type":"ContainerDied","Data":"2a7331c3d03d6429a7a416dd9b872f5bdf6549452f33926d0bb69d137cabb1bc"} Dec 01 20:17:24 crc kubenswrapper[4802]: I1201 20:17:24.540138 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" event={"ID":"a30a5486-cc7c-44be-bad6-5491a8003956","Type":"ContainerStarted","Data":"51fb48e49473ef4d1f22dc09812eab250fc21be0797126552b2b5340d630aa77"} Dec 01 20:17:25 crc kubenswrapper[4802]: I1201 20:17:25.549717 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" event={"ID":"a30a5486-cc7c-44be-bad6-5491a8003956","Type":"ContainerStarted","Data":"821e8dde9993f85285de60639d12ed6876ac696e570eb017f3ecd3349d423dd3"} Dec 01 20:17:25 crc kubenswrapper[4802]: I1201 20:17:25.550104 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:25 crc kubenswrapper[4802]: I1201 
20:17:25.551289 4802 generic.go:334] "Generic (PLEG): container finished" podID="8852056c-2330-4f2b-acd3-89442f05e8c9" containerID="a4ad1aef6c49ef0c311a9993a4f30b0878106ef7dd57c60eb5a085e2caf3edba" exitCode=0 Dec 01 20:17:25 crc kubenswrapper[4802]: I1201 20:17:25.551332 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6674" event={"ID":"8852056c-2330-4f2b-acd3-89442f05e8c9","Type":"ContainerDied","Data":"a4ad1aef6c49ef0c311a9993a4f30b0878106ef7dd57c60eb5a085e2caf3edba"} Dec 01 20:17:25 crc kubenswrapper[4802]: I1201 20:17:25.574019 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" podStartSLOduration=3.574002395 podStartE2EDuration="3.574002395s" podCreationTimestamp="2025-12-01 20:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:17:25.56782259 +0000 UTC m=+1267.130382231" watchObservedRunningTime="2025-12-01 20:17:25.574002395 +0000 UTC m=+1267.136562036" Dec 01 20:17:26 crc kubenswrapper[4802]: I1201 20:17:26.953365 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.063659 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-config-data\") pod \"8852056c-2330-4f2b-acd3-89442f05e8c9\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.064501 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-combined-ca-bundle\") pod \"8852056c-2330-4f2b-acd3-89442f05e8c9\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.064978 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw2dx\" (UniqueName: \"kubernetes.io/projected/8852056c-2330-4f2b-acd3-89442f05e8c9-kube-api-access-fw2dx\") pod \"8852056c-2330-4f2b-acd3-89442f05e8c9\" (UID: \"8852056c-2330-4f2b-acd3-89442f05e8c9\") " Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.071795 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8852056c-2330-4f2b-acd3-89442f05e8c9-kube-api-access-fw2dx" (OuterVolumeSpecName: "kube-api-access-fw2dx") pod "8852056c-2330-4f2b-acd3-89442f05e8c9" (UID: "8852056c-2330-4f2b-acd3-89442f05e8c9"). InnerVolumeSpecName "kube-api-access-fw2dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.094338 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8852056c-2330-4f2b-acd3-89442f05e8c9" (UID: "8852056c-2330-4f2b-acd3-89442f05e8c9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.129425 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-config-data" (OuterVolumeSpecName: "config-data") pod "8852056c-2330-4f2b-acd3-89442f05e8c9" (UID: "8852056c-2330-4f2b-acd3-89442f05e8c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.167551 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.167745 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw2dx\" (UniqueName: \"kubernetes.io/projected/8852056c-2330-4f2b-acd3-89442f05e8c9-kube-api-access-fw2dx\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.167761 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8852056c-2330-4f2b-acd3-89442f05e8c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.567740 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v6674" event={"ID":"8852056c-2330-4f2b-acd3-89442f05e8c9","Type":"ContainerDied","Data":"a2196d0982079ca76fa8ae17b84c9c4fb9c4d1014d4bb376d747098ee66259e0"} Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.567806 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2196d0982079ca76fa8ae17b84c9c4fb9c4d1014d4bb376d747098ee66259e0" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.567825 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v6674" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.853104 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-j4dg5"] Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.853459 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" podUID="a30a5486-cc7c-44be-bad6-5491a8003956" containerName="dnsmasq-dns" containerID="cri-o://821e8dde9993f85285de60639d12ed6876ac696e570eb017f3ecd3349d423dd3" gracePeriod=10 Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.909459 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-svwp7"] Dec 01 20:17:27 crc kubenswrapper[4802]: E1201 20:17:27.909887 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8852056c-2330-4f2b-acd3-89442f05e8c9" containerName="keystone-db-sync" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.909907 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8852056c-2330-4f2b-acd3-89442f05e8c9" containerName="keystone-db-sync" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.917757 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8852056c-2330-4f2b-acd3-89442f05e8c9" containerName="keystone-db-sync" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.919140 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.927225 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sc66v"] Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.928658 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.934146 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.934407 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.934977 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.935137 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.935216 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jd99d" Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.955914 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-svwp7"] Dec 01 20:17:27 crc kubenswrapper[4802]: I1201 20:17:27.968521 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sc66v"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.087681 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-config-data\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.088314 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ffgz\" (UniqueName: \"kubernetes.io/projected/63a96c12-9250-4882-9fb1-3de6abe702f3-kube-api-access-2ffgz\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " 
pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.088486 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-scripts\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.088596 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-credential-keys\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.088753 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-dns-svc\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.088877 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.089026 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-fernet-keys\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " 
pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.089141 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc57w\" (UniqueName: \"kubernetes.io/projected/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-kube-api-access-xc57w\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.089397 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-config\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.089454 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.089497 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-combined-ca-bundle\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.184757 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-p259k"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.185935 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.190515 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.190798 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-296w6" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.190934 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.191771 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-config-data\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.191805 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ffgz\" (UniqueName: \"kubernetes.io/projected/63a96c12-9250-4882-9fb1-3de6abe702f3-kube-api-access-2ffgz\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.191833 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-scripts\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.191856 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-credential-keys\") pod \"keystone-bootstrap-sc66v\" (UID: 
\"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.191898 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-dns-svc\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.191923 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.191963 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-fernet-keys\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.191989 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc57w\" (UniqueName: \"kubernetes.io/projected/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-kube-api-access-xc57w\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.192027 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-config\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 
20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.192057 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.192087 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-combined-ca-bundle\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.194034 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-dns-svc\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.194967 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-config\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.195679 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.202835 4802 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.211643 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-combined-ca-bundle\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.217468 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-scripts\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.218285 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-fernet-keys\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.222255 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.222903 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-config-data\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.228484 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.232589 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.236005 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.236147 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-p259k"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.236874 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-credential-keys\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.264977 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc57w\" (UniqueName: \"kubernetes.io/projected/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-kube-api-access-xc57w\") pod \"keystone-bootstrap-sc66v\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.274268 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.303409 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ffgz\" (UniqueName: \"kubernetes.io/projected/63a96c12-9250-4882-9fb1-3de6abe702f3-kube-api-access-2ffgz\") pod \"dnsmasq-dns-6546db6db7-svwp7\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.303875 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.304905 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f6c930-0ed7-480e-8725-692427ba2b9d-etc-machine-id\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.304995 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-db-sync-config-data\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.305056 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-config-data\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.305153 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-combined-ca-bundle\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.305177 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gzzp\" (UniqueName: \"kubernetes.io/projected/93f6c930-0ed7-480e-8725-692427ba2b9d-kube-api-access-5gzzp\") pod \"cinder-db-sync-p259k\" (UID: 
\"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.305226 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-scripts\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.342909 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-xcrnm"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.344502 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.347240 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.352356 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7v6r6" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.369271 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xcrnm"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.370092 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.373461 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-svwp7"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.374170 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.421505 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xkpk5"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.419994 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-db-sync-config-data\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.422458 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-config-data\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.432340 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-config-data\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.432473 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-run-httpd\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.432677 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-scripts\") pod \"ceilometer-0\" (UID: 
\"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.432818 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.432965 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-combined-ca-bundle\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.433074 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzzp\" (UniqueName: \"kubernetes.io/projected/93f6c930-0ed7-480e-8725-692427ba2b9d-kube-api-access-5gzzp\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.434213 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-scripts\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.434447 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f6c930-0ed7-480e-8725-692427ba2b9d-etc-machine-id\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 
20:17:28.434549 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxgpv\" (UniqueName: \"kubernetes.io/projected/1aa8400d-0d21-46be-8bb9-3dd4996d8500-kube-api-access-dxgpv\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.434634 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-log-httpd\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.434765 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.437066 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f6c930-0ed7-480e-8725-692427ba2b9d-etc-machine-id\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.422942 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.431526 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-db-sync-config-data\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.442257 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-combined-ca-bundle\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.443635 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-config-data\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.450833 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xkpk5"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.452715 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.452922 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-scripts\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.452957 4802 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"placement-placement-dockercfg-ktp48" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.453075 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.466112 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gzzp\" (UniqueName: \"kubernetes.io/projected/93f6c930-0ed7-480e-8725-692427ba2b9d-kube-api-access-5gzzp\") pod \"cinder-db-sync-p259k\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.489273 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-26ssz"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.490972 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.527535 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-26ssz"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.540608 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-hkl2c"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.543349 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545214 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-config\") pod \"neutron-db-sync-xcrnm\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545304 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-scripts\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545335 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm2sl\" (UniqueName: \"kubernetes.io/projected/8b1801b5-50f2-40fc-9e14-386216a4418c-kube-api-access-nm2sl\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545370 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-config-data\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545395 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 
20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545443 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-run-httpd\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545478 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-scripts\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545508 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh7zl\" (UniqueName: \"kubernetes.io/projected/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-kube-api-access-qh7zl\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545539 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-combined-ca-bundle\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545571 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545615 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-config-data\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545658 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1801b5-50f2-40fc-9e14-386216a4418c-logs\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545692 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxgpv\" (UniqueName: \"kubernetes.io/projected/1aa8400d-0d21-46be-8bb9-3dd4996d8500-kube-api-access-dxgpv\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545718 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-config\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545743 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-log-httpd\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545766 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545798 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545829 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545861 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-combined-ca-bundle\") pod \"neutron-db-sync-xcrnm\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.545904 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9ws\" (UniqueName: \"kubernetes.io/projected/1f0b1428-3468-4e47-939d-8614d302bd75-kube-api-access-6q9ws\") pod \"neutron-db-sync-xcrnm\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.546466 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-run-httpd\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.549941 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ptbfk" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.551294 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-log-httpd\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.552459 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.554013 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.555153 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.561101 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-scripts\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.564739 4802 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-db-sync-hkl2c"] Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.600290 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-config-data\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.666078 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh7zl\" (UniqueName: \"kubernetes.io/projected/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-kube-api-access-qh7zl\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.666154 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-combined-ca-bundle\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.666248 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqvc\" (UniqueName: \"kubernetes.io/projected/4d086846-e371-492b-81c2-0fb443b14f30-kube-api-access-bdqvc\") pod \"barbican-db-sync-hkl2c\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.667331 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxgpv\" (UniqueName: \"kubernetes.io/projected/1aa8400d-0d21-46be-8bb9-3dd4996d8500-kube-api-access-dxgpv\") pod \"ceilometer-0\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: 
I1201 20:17:28.667359 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-config-data\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.667450 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1801b5-50f2-40fc-9e14-386216a4418c-logs\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.667491 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-db-sync-config-data\") pod \"barbican-db-sync-hkl2c\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.667518 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-config\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.667539 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.667573 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.667622 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-combined-ca-bundle\") pod \"neutron-db-sync-xcrnm\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.667684 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9ws\" (UniqueName: \"kubernetes.io/projected/1f0b1428-3468-4e47-939d-8614d302bd75-kube-api-access-6q9ws\") pod \"neutron-db-sync-xcrnm\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.667710 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-config\") pod \"neutron-db-sync-xcrnm\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.677851 4802 generic.go:334] "Generic (PLEG): container finished" podID="a30a5486-cc7c-44be-bad6-5491a8003956" containerID="821e8dde9993f85285de60639d12ed6876ac696e570eb017f3ecd3349d423dd3" exitCode=0 Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.678000 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" event={"ID":"a30a5486-cc7c-44be-bad6-5491a8003956","Type":"ContainerDied","Data":"821e8dde9993f85285de60639d12ed6876ac696e570eb017f3ecd3349d423dd3"} Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 
20:17:28.682427 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1801b5-50f2-40fc-9e14-386216a4418c-logs\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.682570 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-config-data\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.682597 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-combined-ca-bundle\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.684703 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-config\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.701660 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.701824 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-p259k" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.702109 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-scripts\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.702149 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm2sl\" (UniqueName: \"kubernetes.io/projected/8b1801b5-50f2-40fc-9e14-386216a4418c-kube-api-access-nm2sl\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.702589 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.702652 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh7zl\" (UniqueName: \"kubernetes.io/projected/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-kube-api-access-qh7zl\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.702889 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 
crc kubenswrapper[4802]: I1201 20:17:28.702948 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-combined-ca-bundle\") pod \"barbican-db-sync-hkl2c\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.715943 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-combined-ca-bundle\") pod \"neutron-db-sync-xcrnm\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.720656 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.723790 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-config\") pod \"neutron-db-sync-xcrnm\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.726381 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-scripts\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.726558 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9ws\" 
(UniqueName: \"kubernetes.io/projected/1f0b1428-3468-4e47-939d-8614d302bd75-kube-api-access-6q9ws\") pod \"neutron-db-sync-xcrnm\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.729691 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-26ssz\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") " pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.730239 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm2sl\" (UniqueName: \"kubernetes.io/projected/8b1801b5-50f2-40fc-9e14-386216a4418c-kube-api-access-nm2sl\") pod \"placement-db-sync-xkpk5\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.748090 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.783850 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xkpk5" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.806283 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-combined-ca-bundle\") pod \"barbican-db-sync-hkl2c\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.806381 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqvc\" (UniqueName: \"kubernetes.io/projected/4d086846-e371-492b-81c2-0fb443b14f30-kube-api-access-bdqvc\") pod \"barbican-db-sync-hkl2c\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.806429 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-db-sync-config-data\") pod \"barbican-db-sync-hkl2c\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.823802 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.843463 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqvc\" (UniqueName: \"kubernetes.io/projected/4d086846-e371-492b-81c2-0fb443b14f30-kube-api-access-bdqvc\") pod \"barbican-db-sync-hkl2c\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.854980 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-combined-ca-bundle\") pod \"barbican-db-sync-hkl2c\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.855359 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-db-sync-config-data\") pod \"barbican-db-sync-hkl2c\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:28 crc kubenswrapper[4802]: I1201 20:17:28.952556 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:17:29 crc kubenswrapper[4802]: I1201 20:17:29.080463 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-svwp7"] Dec 01 20:17:29 crc kubenswrapper[4802]: I1201 20:17:29.101442 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sc66v"] Dec 01 20:17:29 crc kubenswrapper[4802]: I1201 20:17:29.702829 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sc66v" event={"ID":"5dcd23fe-79d2-435b-be8d-5bf26404ad0a","Type":"ContainerStarted","Data":"579401e1051ba58c5e9e4efd000ea6777646162f95955108bf93a6abab07e95b"} Dec 01 20:17:29 crc kubenswrapper[4802]: I1201 20:17:29.703377 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sc66v" event={"ID":"5dcd23fe-79d2-435b-be8d-5bf26404ad0a","Type":"ContainerStarted","Data":"ae5c0bd185ed8ba047d1d57969dabfc18170db341d9b088cc8ed6f3190d904a7"} Dec 01 20:17:29 crc kubenswrapper[4802]: I1201 20:17:29.705460 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-svwp7" event={"ID":"63a96c12-9250-4882-9fb1-3de6abe702f3","Type":"ContainerStarted","Data":"c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75"} Dec 01 20:17:29 crc kubenswrapper[4802]: I1201 20:17:29.705490 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-svwp7" event={"ID":"63a96c12-9250-4882-9fb1-3de6abe702f3","Type":"ContainerStarted","Data":"290b7772847b6689651fa5d09ab5646fccadacefd4fcc635236ba4483971461c"} Dec 01 20:17:29 crc kubenswrapper[4802]: I1201 20:17:29.732065 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sc66v" podStartSLOduration=2.732040222 podStartE2EDuration="2.732040222s" podCreationTimestamp="2025-12-01 20:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:17:29.729218153 +0000 UTC m=+1271.291777804" watchObservedRunningTime="2025-12-01 20:17:29.732040222 +0000 UTC m=+1271.294599883" Dec 01 20:17:29 crc kubenswrapper[4802]: I1201 20:17:29.945891 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-p259k"] Dec 01 20:17:29 crc kubenswrapper[4802]: I1201 20:17:29.966915 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xcrnm"] Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.011376 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.089170 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.140320 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-nb\") pod \"a30a5486-cc7c-44be-bad6-5491a8003956\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.140471 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-sb\") pod \"a30a5486-cc7c-44be-bad6-5491a8003956\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.140507 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-config\") pod \"a30a5486-cc7c-44be-bad6-5491a8003956\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.140589 4802 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsd4x\" (UniqueName: \"kubernetes.io/projected/a30a5486-cc7c-44be-bad6-5491a8003956-kube-api-access-tsd4x\") pod \"a30a5486-cc7c-44be-bad6-5491a8003956\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.140648 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-dns-svc\") pod \"a30a5486-cc7c-44be-bad6-5491a8003956\" (UID: \"a30a5486-cc7c-44be-bad6-5491a8003956\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.174952 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xkpk5"] Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.178557 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30a5486-cc7c-44be-bad6-5491a8003956-kube-api-access-tsd4x" (OuterVolumeSpecName: "kube-api-access-tsd4x") pod "a30a5486-cc7c-44be-bad6-5491a8003956" (UID: "a30a5486-cc7c-44be-bad6-5491a8003956"). InnerVolumeSpecName "kube-api-access-tsd4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.199129 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a30a5486-cc7c-44be-bad6-5491a8003956" (UID: "a30a5486-cc7c-44be-bad6-5491a8003956"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.213753 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a30a5486-cc7c-44be-bad6-5491a8003956" (UID: "a30a5486-cc7c-44be-bad6-5491a8003956"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.234543 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-config" (OuterVolumeSpecName: "config") pod "a30a5486-cc7c-44be-bad6-5491a8003956" (UID: "a30a5486-cc7c-44be-bad6-5491a8003956"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.240769 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a30a5486-cc7c-44be-bad6-5491a8003956" (UID: "a30a5486-cc7c-44be-bad6-5491a8003956"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.248079 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.248112 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.248126 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.248140 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsd4x\" (UniqueName: \"kubernetes.io/projected/a30a5486-cc7c-44be-bad6-5491a8003956-kube-api-access-tsd4x\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.248153 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a30a5486-cc7c-44be-bad6-5491a8003956-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.313299 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hkl2c"] Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.354319 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-26ssz"] Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.355413 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.451727 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-nb\") pod \"63a96c12-9250-4882-9fb1-3de6abe702f3\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.451770 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-config\") pod \"63a96c12-9250-4882-9fb1-3de6abe702f3\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.451887 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-sb\") pod \"63a96c12-9250-4882-9fb1-3de6abe702f3\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.451990 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ffgz\" (UniqueName: \"kubernetes.io/projected/63a96c12-9250-4882-9fb1-3de6abe702f3-kube-api-access-2ffgz\") pod \"63a96c12-9250-4882-9fb1-3de6abe702f3\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.452037 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-dns-svc\") pod \"63a96c12-9250-4882-9fb1-3de6abe702f3\" (UID: \"63a96c12-9250-4882-9fb1-3de6abe702f3\") " Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.459427 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/63a96c12-9250-4882-9fb1-3de6abe702f3-kube-api-access-2ffgz" (OuterVolumeSpecName: "kube-api-access-2ffgz") pod "63a96c12-9250-4882-9fb1-3de6abe702f3" (UID: "63a96c12-9250-4882-9fb1-3de6abe702f3"). InnerVolumeSpecName "kube-api-access-2ffgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.485776 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "63a96c12-9250-4882-9fb1-3de6abe702f3" (UID: "63a96c12-9250-4882-9fb1-3de6abe702f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.487431 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-config" (OuterVolumeSpecName: "config") pod "63a96c12-9250-4882-9fb1-3de6abe702f3" (UID: "63a96c12-9250-4882-9fb1-3de6abe702f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.506951 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "63a96c12-9250-4882-9fb1-3de6abe702f3" (UID: "63a96c12-9250-4882-9fb1-3de6abe702f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.526136 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "63a96c12-9250-4882-9fb1-3de6abe702f3" (UID: "63a96c12-9250-4882-9fb1-3de6abe702f3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.554495 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.554530 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.554541 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.554550 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63a96c12-9250-4882-9fb1-3de6abe702f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.554561 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ffgz\" (UniqueName: \"kubernetes.io/projected/63a96c12-9250-4882-9fb1-3de6abe702f3-kube-api-access-2ffgz\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.621071 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.717060 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hkl2c" event={"ID":"4d086846-e371-492b-81c2-0fb443b14f30","Type":"ContainerStarted","Data":"5f816cb09c113db6299ca74920a84bc729d3c840d7ad1f95cc461245426ae3da"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.732513 4802 generic.go:334] "Generic (PLEG): container finished" 
podID="63a96c12-9250-4882-9fb1-3de6abe702f3" containerID="c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75" exitCode=0 Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.732613 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-svwp7" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.737794 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xcrnm" event={"ID":"1f0b1428-3468-4e47-939d-8614d302bd75","Type":"ContainerStarted","Data":"b960e6fc96accaaf52672c959daf9ccdcd9204d9deb6a549e9947c5e12659930"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.737833 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xcrnm" event={"ID":"1f0b1428-3468-4e47-939d-8614d302bd75","Type":"ContainerStarted","Data":"347c575746a70dfa0ac025397675a29123a49109d9ec1d9e8d352c05e93af8a8"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.737843 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p259k" event={"ID":"93f6c930-0ed7-480e-8725-692427ba2b9d","Type":"ContainerStarted","Data":"ab807c489c268d8ed1c172665d567f9bc6af8e781f2a03fed8d1dcdb5c2add4c"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.737853 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-svwp7" event={"ID":"63a96c12-9250-4882-9fb1-3de6abe702f3","Type":"ContainerDied","Data":"c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.737866 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-svwp7" event={"ID":"63a96c12-9250-4882-9fb1-3de6abe702f3","Type":"ContainerDied","Data":"290b7772847b6689651fa5d09ab5646fccadacefd4fcc635236ba4483971461c"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.737884 4802 scope.go:117] "RemoveContainer" 
containerID="c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.745371 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" event={"ID":"a30a5486-cc7c-44be-bad6-5491a8003956","Type":"ContainerDied","Data":"51fb48e49473ef4d1f22dc09812eab250fc21be0797126552b2b5340d630aa77"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.745690 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-j4dg5" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.779932 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xkpk5" event={"ID":"8b1801b5-50f2-40fc-9e14-386216a4418c","Type":"ContainerStarted","Data":"8eed099f81b4c4011037b04734b2f27be0b51764b2bfa395f4cb3863804b8c94"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.782017 4802 generic.go:334] "Generic (PLEG): container finished" podID="6aac0af2-647b-4f69-b07a-96bdd62c6e3d" containerID="d47e3f4934f8a4f27df129b98048a3eba034eee49377d00ebe202853d1d06889" exitCode=0 Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.782097 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" event={"ID":"6aac0af2-647b-4f69-b07a-96bdd62c6e3d","Type":"ContainerDied","Data":"d47e3f4934f8a4f27df129b98048a3eba034eee49377d00ebe202853d1d06889"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.782120 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" event={"ID":"6aac0af2-647b-4f69-b07a-96bdd62c6e3d","Type":"ContainerStarted","Data":"5822552c8c60ecf7bd611f02b69bac7cd2320c8e11ebccc0cec3dab795cc7afc"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.787965 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1aa8400d-0d21-46be-8bb9-3dd4996d8500","Type":"ContainerStarted","Data":"bc96652feadfde98d95e120e56ca086d2bc4fb2ba1f2854522c7091a9abd1849"} Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.794442 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-xcrnm" podStartSLOduration=2.794421624 podStartE2EDuration="2.794421624s" podCreationTimestamp="2025-12-01 20:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:17:30.763676157 +0000 UTC m=+1272.326235808" watchObservedRunningTime="2025-12-01 20:17:30.794421624 +0000 UTC m=+1272.356981265" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.808600 4802 scope.go:117] "RemoveContainer" containerID="c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75" Dec 01 20:17:30 crc kubenswrapper[4802]: E1201 20:17:30.815997 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75\": container with ID starting with c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75 not found: ID does not exist" containerID="c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.816119 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75"} err="failed to get container status \"c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75\": rpc error: code = NotFound desc = could not find container \"c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75\": container with ID starting with c3a830f6af060141e15777cf44758f74fcec6a825da97af69baa5b24e9195d75 not found: ID does not exist" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 
20:17:30.816147 4802 scope.go:117] "RemoveContainer" containerID="821e8dde9993f85285de60639d12ed6876ac696e570eb017f3ecd3349d423dd3" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.893508 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-svwp7"] Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.909405 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-svwp7"] Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.910144 4802 scope.go:117] "RemoveContainer" containerID="2a7331c3d03d6429a7a416dd9b872f5bdf6549452f33926d0bb69d137cabb1bc" Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.925941 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-j4dg5"] Dec 01 20:17:30 crc kubenswrapper[4802]: I1201 20:17:30.934655 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-j4dg5"] Dec 01 20:17:31 crc kubenswrapper[4802]: I1201 20:17:31.816939 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" event={"ID":"6aac0af2-647b-4f69-b07a-96bdd62c6e3d","Type":"ContainerStarted","Data":"2a15361e2bc6d911d2d8f09c6971d0d8e01c81e641e2303058f26a9346425b49"} Dec 01 20:17:31 crc kubenswrapper[4802]: I1201 20:17:31.818148 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:32 crc kubenswrapper[4802]: I1201 20:17:32.017626 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" podStartSLOduration=4.017610214 podStartE2EDuration="4.017610214s" podCreationTimestamp="2025-12-01 20:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:17:32.014805076 +0000 UTC m=+1273.577364727" watchObservedRunningTime="2025-12-01 20:17:32.017610214 
+0000 UTC m=+1273.580169855" Dec 01 20:17:32 crc kubenswrapper[4802]: I1201 20:17:32.733039 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a96c12-9250-4882-9fb1-3de6abe702f3" path="/var/lib/kubelet/pods/63a96c12-9250-4882-9fb1-3de6abe702f3/volumes" Dec 01 20:17:32 crc kubenswrapper[4802]: I1201 20:17:32.734096 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30a5486-cc7c-44be-bad6-5491a8003956" path="/var/lib/kubelet/pods/a30a5486-cc7c-44be-bad6-5491a8003956/volumes" Dec 01 20:17:38 crc kubenswrapper[4802]: I1201 20:17:38.886657 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" Dec 01 20:17:39 crc kubenswrapper[4802]: I1201 20:17:39.145853 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kw24b"] Dec 01 20:17:39 crc kubenswrapper[4802]: I1201 20:17:39.146144 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" podUID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerName="dnsmasq-dns" containerID="cri-o://4e9df40859e0444f99fb424b38a9c4c02ab36d8eff9728b2df85e6162ce4ed5d" gracePeriod=10 Dec 01 20:17:39 crc kubenswrapper[4802]: I1201 20:17:39.929889 4802 generic.go:334] "Generic (PLEG): container finished" podID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerID="4e9df40859e0444f99fb424b38a9c4c02ab36d8eff9728b2df85e6162ce4ed5d" exitCode=0 Dec 01 20:17:39 crc kubenswrapper[4802]: I1201 20:17:39.929951 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" event={"ID":"415afd0d-1968-4f65-b7ed-6d4acbde81c9","Type":"ContainerDied","Data":"4e9df40859e0444f99fb424b38a9c4c02ab36d8eff9728b2df85e6162ce4ed5d"} Dec 01 20:17:40 crc kubenswrapper[4802]: I1201 20:17:40.374668 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" 
podUID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Dec 01 20:17:45 crc kubenswrapper[4802]: I1201 20:17:45.997118 4802 generic.go:334] "Generic (PLEG): container finished" podID="5dcd23fe-79d2-435b-be8d-5bf26404ad0a" containerID="579401e1051ba58c5e9e4efd000ea6777646162f95955108bf93a6abab07e95b" exitCode=0 Dec 01 20:17:45 crc kubenswrapper[4802]: I1201 20:17:45.997239 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sc66v" event={"ID":"5dcd23fe-79d2-435b-be8d-5bf26404ad0a","Type":"ContainerDied","Data":"579401e1051ba58c5e9e4efd000ea6777646162f95955108bf93a6abab07e95b"} Dec 01 20:17:46 crc kubenswrapper[4802]: E1201 20:17:46.455799 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 01 20:17:46 crc kubenswrapper[4802]: E1201 20:17:46.456034 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nm2sl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-xkpk5_openstack(8b1801b5-50f2-40fc-9e14-386216a4418c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:17:46 crc kubenswrapper[4802]: E1201 20:17:46.457499 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-xkpk5" podUID="8b1801b5-50f2-40fc-9e14-386216a4418c" Dec 01 20:17:47 crc kubenswrapper[4802]: E1201 20:17:47.010606 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-xkpk5" podUID="8b1801b5-50f2-40fc-9e14-386216a4418c" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.273757 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.294783 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-nb\") pod \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.294880 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-sb\") pod \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.294927 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2fzr\" (UniqueName: \"kubernetes.io/projected/415afd0d-1968-4f65-b7ed-6d4acbde81c9-kube-api-access-z2fzr\") pod \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " Dec 01 20:17:47 crc kubenswrapper[4802]: E1201 20:17:47.335751 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 01 20:17:47 crc kubenswrapper[4802]: E1201 20:17:47.335940 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cdh579h599h5ch77hd5h58h697h59hchf9h7ch64h9hfchd6h594h5c9h576h5d5hb7h57fh5ch666h9fh597h65ch7h56bh5d9h64bh576q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dxgpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1aa8400d-0d21-46be-8bb9-3dd4996d8500): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.336368 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/415afd0d-1968-4f65-b7ed-6d4acbde81c9-kube-api-access-z2fzr" (OuterVolumeSpecName: "kube-api-access-z2fzr") pod "415afd0d-1968-4f65-b7ed-6d4acbde81c9" (UID: "415afd0d-1968-4f65-b7ed-6d4acbde81c9"). InnerVolumeSpecName "kube-api-access-z2fzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.350943 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "415afd0d-1968-4f65-b7ed-6d4acbde81c9" (UID: "415afd0d-1968-4f65-b7ed-6d4acbde81c9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.353343 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "415afd0d-1968-4f65-b7ed-6d4acbde81c9" (UID: "415afd0d-1968-4f65-b7ed-6d4acbde81c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.396876 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-dns-svc\") pod \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.396937 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-config\") pod \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\" (UID: \"415afd0d-1968-4f65-b7ed-6d4acbde81c9\") " Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.397255 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.397271 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.397284 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2fzr\" (UniqueName: \"kubernetes.io/projected/415afd0d-1968-4f65-b7ed-6d4acbde81c9-kube-api-access-z2fzr\") on node \"crc\" DevicePath \"\"" Dec 
01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.447601 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-config" (OuterVolumeSpecName: "config") pod "415afd0d-1968-4f65-b7ed-6d4acbde81c9" (UID: "415afd0d-1968-4f65-b7ed-6d4acbde81c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.449655 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "415afd0d-1968-4f65-b7ed-6d4acbde81c9" (UID: "415afd0d-1968-4f65-b7ed-6d4acbde81c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.498425 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:47 crc kubenswrapper[4802]: I1201 20:17:47.498460 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/415afd0d-1968-4f65-b7ed-6d4acbde81c9-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:48 crc kubenswrapper[4802]: I1201 20:17:48.022744 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" event={"ID":"415afd0d-1968-4f65-b7ed-6d4acbde81c9","Type":"ContainerDied","Data":"c1190ca03c369d4bfe2f2e9605bea238c993267b04bfa0768a34b1505f1fc8b0"} Dec 01 20:17:48 crc kubenswrapper[4802]: I1201 20:17:48.023380 4802 scope.go:117] "RemoveContainer" containerID="4e9df40859e0444f99fb424b38a9c4c02ab36d8eff9728b2df85e6162ce4ed5d" Dec 01 20:17:48 crc kubenswrapper[4802]: I1201 20:17:48.022789 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" Dec 01 20:17:48 crc kubenswrapper[4802]: I1201 20:17:48.065635 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kw24b"] Dec 01 20:17:48 crc kubenswrapper[4802]: I1201 20:17:48.072465 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-kw24b"] Dec 01 20:17:48 crc kubenswrapper[4802]: I1201 20:17:48.748676 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" path="/var/lib/kubelet/pods/415afd0d-1968-4f65-b7ed-6d4acbde81c9/volumes" Dec 01 20:17:50 crc kubenswrapper[4802]: I1201 20:17:50.374423 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-kw24b" podUID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 01 20:17:57 crc kubenswrapper[4802]: E1201 20:17:57.550451 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 01 20:17:57 crc kubenswrapper[4802]: E1201 20:17:57.550937 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdqvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-hkl2c_openstack(4d086846-e371-492b-81c2-0fb443b14f30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:17:57 crc kubenswrapper[4802]: E1201 20:17:57.552773 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-hkl2c" 
podUID="4d086846-e371-492b-81c2-0fb443b14f30" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.614633 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.764839 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-credential-keys\") pod \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.765831 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-scripts\") pod \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.765921 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-config-data\") pod \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.766163 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc57w\" (UniqueName: \"kubernetes.io/projected/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-kube-api-access-xc57w\") pod \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.766278 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-combined-ca-bundle\") pod \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " Dec 
01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.766390 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-fernet-keys\") pod \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\" (UID: \"5dcd23fe-79d2-435b-be8d-5bf26404ad0a\") " Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.771109 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5dcd23fe-79d2-435b-be8d-5bf26404ad0a" (UID: "5dcd23fe-79d2-435b-be8d-5bf26404ad0a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.771801 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-scripts" (OuterVolumeSpecName: "scripts") pod "5dcd23fe-79d2-435b-be8d-5bf26404ad0a" (UID: "5dcd23fe-79d2-435b-be8d-5bf26404ad0a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.772132 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-kube-api-access-xc57w" (OuterVolumeSpecName: "kube-api-access-xc57w") pod "5dcd23fe-79d2-435b-be8d-5bf26404ad0a" (UID: "5dcd23fe-79d2-435b-be8d-5bf26404ad0a"). InnerVolumeSpecName "kube-api-access-xc57w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.772427 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5dcd23fe-79d2-435b-be8d-5bf26404ad0a" (UID: "5dcd23fe-79d2-435b-be8d-5bf26404ad0a"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.791742 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dcd23fe-79d2-435b-be8d-5bf26404ad0a" (UID: "5dcd23fe-79d2-435b-be8d-5bf26404ad0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.802551 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-config-data" (OuterVolumeSpecName: "config-data") pod "5dcd23fe-79d2-435b-be8d-5bf26404ad0a" (UID: "5dcd23fe-79d2-435b-be8d-5bf26404ad0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.868532 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc57w\" (UniqueName: \"kubernetes.io/projected/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-kube-api-access-xc57w\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.868576 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.868588 4802 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.868599 4802 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.868610 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:57 crc kubenswrapper[4802]: I1201 20:17:57.868631 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dcd23fe-79d2-435b-be8d-5bf26404ad0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.089292 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.089626 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.110781 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sc66v" event={"ID":"5dcd23fe-79d2-435b-be8d-5bf26404ad0a","Type":"ContainerDied","Data":"ae5c0bd185ed8ba047d1d57969dabfc18170db341d9b088cc8ed6f3190d904a7"} Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.110824 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae5c0bd185ed8ba047d1d57969dabfc18170db341d9b088cc8ed6f3190d904a7" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.110804 4802 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sc66v" Dec 01 20:17:58 crc kubenswrapper[4802]: E1201 20:17:58.113478 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-hkl2c" podUID="4d086846-e371-492b-81c2-0fb443b14f30" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.760363 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sc66v"] Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.764699 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sc66v"] Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.870021 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xpvd9"] Dec 01 20:17:58 crc kubenswrapper[4802]: E1201 20:17:58.870853 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a96c12-9250-4882-9fb1-3de6abe702f3" containerName="init" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.870957 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a96c12-9250-4882-9fb1-3de6abe702f3" containerName="init" Dec 01 20:17:58 crc kubenswrapper[4802]: E1201 20:17:58.871045 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30a5486-cc7c-44be-bad6-5491a8003956" containerName="dnsmasq-dns" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.871121 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30a5486-cc7c-44be-bad6-5491a8003956" containerName="dnsmasq-dns" Dec 01 20:17:58 crc kubenswrapper[4802]: E1201 20:17:58.871364 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerName="dnsmasq-dns" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.871456 4802 
state_mem.go:107] "Deleted CPUSet assignment" podUID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerName="dnsmasq-dns" Dec 01 20:17:58 crc kubenswrapper[4802]: E1201 20:17:58.871545 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerName="init" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.871629 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerName="init" Dec 01 20:17:58 crc kubenswrapper[4802]: E1201 20:17:58.871711 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcd23fe-79d2-435b-be8d-5bf26404ad0a" containerName="keystone-bootstrap" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.871792 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcd23fe-79d2-435b-be8d-5bf26404ad0a" containerName="keystone-bootstrap" Dec 01 20:17:58 crc kubenswrapper[4802]: E1201 20:17:58.871888 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30a5486-cc7c-44be-bad6-5491a8003956" containerName="init" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.871967 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30a5486-cc7c-44be-bad6-5491a8003956" containerName="init" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.872317 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="63a96c12-9250-4882-9fb1-3de6abe702f3" containerName="init" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.872421 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="415afd0d-1968-4f65-b7ed-6d4acbde81c9" containerName="dnsmasq-dns" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.872519 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dcd23fe-79d2-435b-be8d-5bf26404ad0a" containerName="keystone-bootstrap" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.872619 4802 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a30a5486-cc7c-44be-bad6-5491a8003956" containerName="dnsmasq-dns" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.873403 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.875852 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.875928 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.875852 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.877363 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.878091 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jd99d" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.884214 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xpvd9"] Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.986340 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-fernet-keys\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.986678 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-combined-ca-bundle\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " 
pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.986743 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-credential-keys\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.986827 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl4bl\" (UniqueName: \"kubernetes.io/projected/7fea3a37-ff10-43ff-ace6-f79041e5617f-kube-api-access-xl4bl\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.986871 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-config-data\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:58 crc kubenswrapper[4802]: I1201 20:17:58.986940 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-scripts\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.088732 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-credential-keys\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " 
pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.088823 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl4bl\" (UniqueName: \"kubernetes.io/projected/7fea3a37-ff10-43ff-ace6-f79041e5617f-kube-api-access-xl4bl\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.088847 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-config-data\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.088885 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-scripts\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.088965 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-fernet-keys\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.088986 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-combined-ca-bundle\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.094625 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-config-data\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.095059 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-fernet-keys\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.095496 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-credential-keys\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.095499 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-scripts\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.095801 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-combined-ca-bundle\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.107719 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl4bl\" (UniqueName: 
\"kubernetes.io/projected/7fea3a37-ff10-43ff-ace6-f79041e5617f-kube-api-access-xl4bl\") pod \"keystone-bootstrap-xpvd9\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.199884 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.329835 4802 scope.go:117] "RemoveContainer" containerID="f7c0d46599dfdc92d7ba66dfcaa0ecd5c465ab93ef2bbce2390a7d6f4124b832" Dec 01 20:17:59 crc kubenswrapper[4802]: E1201 20:17:59.411164 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 01 20:17:59 crc kubenswrapper[4802]: E1201 20:17:59.411611 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5gzzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-p259k_openstack(93f6c930-0ed7-480e-8725-692427ba2b9d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:17:59 crc kubenswrapper[4802]: E1201 20:17:59.413084 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-p259k" podUID="93f6c930-0ed7-480e-8725-692427ba2b9d" Dec 01 20:17:59 crc kubenswrapper[4802]: I1201 20:17:59.812371 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xpvd9"] Dec 01 20:18:00 crc kubenswrapper[4802]: I1201 20:18:00.130982 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xpvd9" event={"ID":"7fea3a37-ff10-43ff-ace6-f79041e5617f","Type":"ContainerStarted","Data":"6d42a16ae8e21143fd39acba472b9547ba784f977d607107345fafa541dd575a"} Dec 01 20:18:00 crc kubenswrapper[4802]: I1201 20:18:00.131278 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xpvd9" event={"ID":"7fea3a37-ff10-43ff-ace6-f79041e5617f","Type":"ContainerStarted","Data":"40555949d5bc0335f491a11f89bc94b5ad0959129a2a8d31f45960bb81e2261d"} Dec 01 20:18:00 crc kubenswrapper[4802]: I1201 20:18:00.135003 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa8400d-0d21-46be-8bb9-3dd4996d8500","Type":"ContainerStarted","Data":"556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302"} Dec 01 20:18:00 crc kubenswrapper[4802]: E1201 20:18:00.136082 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-p259k" podUID="93f6c930-0ed7-480e-8725-692427ba2b9d" Dec 01 20:18:00 crc kubenswrapper[4802]: I1201 20:18:00.153823 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xpvd9" podStartSLOduration=2.153807297 podStartE2EDuration="2.153807297s" podCreationTimestamp="2025-12-01 20:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:00.151953699 +0000 UTC m=+1301.714513360" watchObservedRunningTime="2025-12-01 20:18:00.153807297 +0000 UTC m=+1301.716366938" Dec 01 20:18:00 crc kubenswrapper[4802]: I1201 20:18:00.732384 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dcd23fe-79d2-435b-be8d-5bf26404ad0a" path="/var/lib/kubelet/pods/5dcd23fe-79d2-435b-be8d-5bf26404ad0a/volumes" Dec 01 20:18:04 crc kubenswrapper[4802]: I1201 20:18:04.182119 4802 generic.go:334] "Generic (PLEG): container finished" podID="7fea3a37-ff10-43ff-ace6-f79041e5617f" containerID="6d42a16ae8e21143fd39acba472b9547ba784f977d607107345fafa541dd575a" exitCode=0 Dec 01 20:18:04 crc kubenswrapper[4802]: I1201 20:18:04.182681 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xpvd9" event={"ID":"7fea3a37-ff10-43ff-ace6-f79041e5617f","Type":"ContainerDied","Data":"6d42a16ae8e21143fd39acba472b9547ba784f977d607107345fafa541dd575a"} Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.198334 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xkpk5" event={"ID":"8b1801b5-50f2-40fc-9e14-386216a4418c","Type":"ContainerStarted","Data":"cc9037216683e5e4d67c82eb3b1440096e90685b5a00073794fd433b4b23a47c"} Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.202808 4802 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa8400d-0d21-46be-8bb9-3dd4996d8500","Type":"ContainerStarted","Data":"883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245"} Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.501401 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.606965 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-scripts\") pod \"7fea3a37-ff10-43ff-ace6-f79041e5617f\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.607092 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-fernet-keys\") pod \"7fea3a37-ff10-43ff-ace6-f79041e5617f\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.607228 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-combined-ca-bundle\") pod \"7fea3a37-ff10-43ff-ace6-f79041e5617f\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.607325 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-credential-keys\") pod \"7fea3a37-ff10-43ff-ace6-f79041e5617f\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.607352 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-config-data\") pod \"7fea3a37-ff10-43ff-ace6-f79041e5617f\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.607412 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl4bl\" (UniqueName: \"kubernetes.io/projected/7fea3a37-ff10-43ff-ace6-f79041e5617f-kube-api-access-xl4bl\") pod \"7fea3a37-ff10-43ff-ace6-f79041e5617f\" (UID: \"7fea3a37-ff10-43ff-ace6-f79041e5617f\") " Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.612452 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7fea3a37-ff10-43ff-ace6-f79041e5617f" (UID: "7fea3a37-ff10-43ff-ace6-f79041e5617f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.612511 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fea3a37-ff10-43ff-ace6-f79041e5617f-kube-api-access-xl4bl" (OuterVolumeSpecName: "kube-api-access-xl4bl") pod "7fea3a37-ff10-43ff-ace6-f79041e5617f" (UID: "7fea3a37-ff10-43ff-ace6-f79041e5617f"). InnerVolumeSpecName "kube-api-access-xl4bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.612714 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7fea3a37-ff10-43ff-ace6-f79041e5617f" (UID: "7fea3a37-ff10-43ff-ace6-f79041e5617f"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.613675 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-scripts" (OuterVolumeSpecName: "scripts") pod "7fea3a37-ff10-43ff-ace6-f79041e5617f" (UID: "7fea3a37-ff10-43ff-ace6-f79041e5617f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.631935 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fea3a37-ff10-43ff-ace6-f79041e5617f" (UID: "7fea3a37-ff10-43ff-ace6-f79041e5617f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.634153 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-config-data" (OuterVolumeSpecName: "config-data") pod "7fea3a37-ff10-43ff-ace6-f79041e5617f" (UID: "7fea3a37-ff10-43ff-ace6-f79041e5617f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.709405 4802 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.709446 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.709468 4802 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.709483 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.709495 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl4bl\" (UniqueName: \"kubernetes.io/projected/7fea3a37-ff10-43ff-ace6-f79041e5617f-kube-api-access-xl4bl\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:05 crc kubenswrapper[4802]: I1201 20:18:05.709508 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fea3a37-ff10-43ff-ace6-f79041e5617f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.221523 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xpvd9" event={"ID":"7fea3a37-ff10-43ff-ace6-f79041e5617f","Type":"ContainerDied","Data":"40555949d5bc0335f491a11f89bc94b5ad0959129a2a8d31f45960bb81e2261d"} Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 
20:18:06.221592 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40555949d5bc0335f491a11f89bc94b5ad0959129a2a8d31f45960bb81e2261d" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.221595 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xpvd9" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.260426 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xkpk5" podStartSLOduration=3.6856059119999998 podStartE2EDuration="38.260408715s" podCreationTimestamp="2025-12-01 20:17:28 +0000 UTC" firstStartedPulling="2025-12-01 20:17:30.183086302 +0000 UTC m=+1271.745645943" lastFinishedPulling="2025-12-01 20:18:04.757889095 +0000 UTC m=+1306.320448746" observedRunningTime="2025-12-01 20:18:06.258474984 +0000 UTC m=+1307.821034645" watchObservedRunningTime="2025-12-01 20:18:06.260408715 +0000 UTC m=+1307.822968356" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.305577 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7977dfdfb6-dnr99"] Dec 01 20:18:06 crc kubenswrapper[4802]: E1201 20:18:06.305955 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fea3a37-ff10-43ff-ace6-f79041e5617f" containerName="keystone-bootstrap" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.305978 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fea3a37-ff10-43ff-ace6-f79041e5617f" containerName="keystone-bootstrap" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.306184 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fea3a37-ff10-43ff-ace6-f79041e5617f" containerName="keystone-bootstrap" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.306861 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.310398 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.310557 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.310806 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.311262 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jd99d" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.315531 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.315599 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.336456 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7977dfdfb6-dnr99"] Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.422302 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-credential-keys\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.422720 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-internal-tls-certs\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " 
pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.422881 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn226\" (UniqueName: \"kubernetes.io/projected/f78fa699-c933-4160-b4dc-5b3db575ac17-kube-api-access-bn226\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.423030 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-fernet-keys\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.423233 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-scripts\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.423358 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-combined-ca-bundle\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.424268 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-public-tls-certs\") pod \"keystone-7977dfdfb6-dnr99\" (UID: 
\"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.424483 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-config-data\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.525906 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-config-data\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.526183 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-credential-keys\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.526272 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-internal-tls-certs\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.526295 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn226\" (UniqueName: \"kubernetes.io/projected/f78fa699-c933-4160-b4dc-5b3db575ac17-kube-api-access-bn226\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " 
pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.526322 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-fernet-keys\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.526347 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-scripts\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.526364 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-combined-ca-bundle\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.526387 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-public-tls-certs\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.534274 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-config-data\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.534734 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-scripts\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.536330 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-fernet-keys\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.536592 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-combined-ca-bundle\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.536971 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-credential-keys\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.540753 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-public-tls-certs\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.548363 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f78fa699-c933-4160-b4dc-5b3db575ac17-internal-tls-certs\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.557805 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn226\" (UniqueName: \"kubernetes.io/projected/f78fa699-c933-4160-b4dc-5b3db575ac17-kube-api-access-bn226\") pod \"keystone-7977dfdfb6-dnr99\" (UID: \"f78fa699-c933-4160-b4dc-5b3db575ac17\") " pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:06 crc kubenswrapper[4802]: I1201 20:18:06.626625 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:07 crc kubenswrapper[4802]: I1201 20:18:07.256846 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7977dfdfb6-dnr99"] Dec 01 20:18:08 crc kubenswrapper[4802]: I1201 20:18:08.240721 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7977dfdfb6-dnr99" event={"ID":"f78fa699-c933-4160-b4dc-5b3db575ac17","Type":"ContainerStarted","Data":"b7d31acd34333796c076bcbea331a5456b0581ec64e72ec1779dbbabb8e7a115"} Dec 01 20:18:08 crc kubenswrapper[4802]: I1201 20:18:08.241101 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7977dfdfb6-dnr99" event={"ID":"f78fa699-c933-4160-b4dc-5b3db575ac17","Type":"ContainerStarted","Data":"195f472c8e4600cd179c4a70dcad477fc31e4808454daff22263007bf7bd8e85"} Dec 01 20:18:08 crc kubenswrapper[4802]: I1201 20:18:08.241137 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:08 crc kubenswrapper[4802]: I1201 20:18:08.244109 4802 generic.go:334] "Generic (PLEG): container finished" podID="8b1801b5-50f2-40fc-9e14-386216a4418c" containerID="cc9037216683e5e4d67c82eb3b1440096e90685b5a00073794fd433b4b23a47c" exitCode=0 Dec 01 
20:18:08 crc kubenswrapper[4802]: I1201 20:18:08.244145 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xkpk5" event={"ID":"8b1801b5-50f2-40fc-9e14-386216a4418c","Type":"ContainerDied","Data":"cc9037216683e5e4d67c82eb3b1440096e90685b5a00073794fd433b4b23a47c"} Dec 01 20:18:08 crc kubenswrapper[4802]: I1201 20:18:08.277393 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7977dfdfb6-dnr99" podStartSLOduration=2.277370734 podStartE2EDuration="2.277370734s" podCreationTimestamp="2025-12-01 20:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:08.264228342 +0000 UTC m=+1309.826788193" watchObservedRunningTime="2025-12-01 20:18:08.277370734 +0000 UTC m=+1309.839930375" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.055494 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xkpk5" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.136569 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-scripts\") pod \"8b1801b5-50f2-40fc-9e14-386216a4418c\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.136665 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-config-data\") pod \"8b1801b5-50f2-40fc-9e14-386216a4418c\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.136732 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-combined-ca-bundle\") 
pod \"8b1801b5-50f2-40fc-9e14-386216a4418c\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.136775 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm2sl\" (UniqueName: \"kubernetes.io/projected/8b1801b5-50f2-40fc-9e14-386216a4418c-kube-api-access-nm2sl\") pod \"8b1801b5-50f2-40fc-9e14-386216a4418c\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.136840 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1801b5-50f2-40fc-9e14-386216a4418c-logs\") pod \"8b1801b5-50f2-40fc-9e14-386216a4418c\" (UID: \"8b1801b5-50f2-40fc-9e14-386216a4418c\") " Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.137550 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b1801b5-50f2-40fc-9e14-386216a4418c-logs" (OuterVolumeSpecName: "logs") pod "8b1801b5-50f2-40fc-9e14-386216a4418c" (UID: "8b1801b5-50f2-40fc-9e14-386216a4418c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.142878 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b1801b5-50f2-40fc-9e14-386216a4418c-kube-api-access-nm2sl" (OuterVolumeSpecName: "kube-api-access-nm2sl") pod "8b1801b5-50f2-40fc-9e14-386216a4418c" (UID: "8b1801b5-50f2-40fc-9e14-386216a4418c"). InnerVolumeSpecName "kube-api-access-nm2sl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.145999 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-scripts" (OuterVolumeSpecName: "scripts") pod "8b1801b5-50f2-40fc-9e14-386216a4418c" (UID: "8b1801b5-50f2-40fc-9e14-386216a4418c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.177378 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b1801b5-50f2-40fc-9e14-386216a4418c" (UID: "8b1801b5-50f2-40fc-9e14-386216a4418c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.210459 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-config-data" (OuterVolumeSpecName: "config-data") pod "8b1801b5-50f2-40fc-9e14-386216a4418c" (UID: "8b1801b5-50f2-40fc-9e14-386216a4418c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.240294 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.240349 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.240363 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1801b5-50f2-40fc-9e14-386216a4418c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.240379 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm2sl\" (UniqueName: \"kubernetes.io/projected/8b1801b5-50f2-40fc-9e14-386216a4418c-kube-api-access-nm2sl\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.240412 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b1801b5-50f2-40fc-9e14-386216a4418c-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.301943 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xkpk5" event={"ID":"8b1801b5-50f2-40fc-9e14-386216a4418c","Type":"ContainerDied","Data":"8eed099f81b4c4011037b04734b2f27be0b51764b2bfa395f4cb3863804b8c94"} Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.301990 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8eed099f81b4c4011037b04734b2f27be0b51764b2bfa395f4cb3863804b8c94" Dec 01 20:18:12 crc kubenswrapper[4802]: I1201 20:18:12.302059 4802 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-xkpk5" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.197608 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-646fbc85dd-2ttbm"] Dec 01 20:18:13 crc kubenswrapper[4802]: E1201 20:18:13.198074 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1801b5-50f2-40fc-9e14-386216a4418c" containerName="placement-db-sync" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.198096 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1801b5-50f2-40fc-9e14-386216a4418c" containerName="placement-db-sync" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.198325 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b1801b5-50f2-40fc-9e14-386216a4418c" containerName="placement-db-sync" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.199447 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.206995 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-646fbc85dd-2ttbm"] Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.211700 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.212005 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ktp48" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.211963 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.211987 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.212012 4802 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"placement-config-data" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.314034 4802 generic.go:334] "Generic (PLEG): container finished" podID="1f0b1428-3468-4e47-939d-8614d302bd75" containerID="b960e6fc96accaaf52672c959daf9ccdcd9204d9deb6a549e9947c5e12659930" exitCode=0 Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.314085 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xcrnm" event={"ID":"1f0b1428-3468-4e47-939d-8614d302bd75","Type":"ContainerDied","Data":"b960e6fc96accaaf52672c959daf9ccdcd9204d9deb6a549e9947c5e12659930"} Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.357800 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-combined-ca-bundle\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.357866 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-scripts\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.358225 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-config-data\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.358336 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-public-tls-certs\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.358422 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8xg5\" (UniqueName: \"kubernetes.io/projected/3853723d-4452-4112-99bf-c3850a983f5d-kube-api-access-t8xg5\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.358457 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3853723d-4452-4112-99bf-c3850a983f5d-logs\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.358484 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-internal-tls-certs\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.460493 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-config-data\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.460565 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-public-tls-certs\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.460602 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8xg5\" (UniqueName: \"kubernetes.io/projected/3853723d-4452-4112-99bf-c3850a983f5d-kube-api-access-t8xg5\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.460621 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3853723d-4452-4112-99bf-c3850a983f5d-logs\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.460642 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-internal-tls-certs\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.460668 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-combined-ca-bundle\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.460692 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-scripts\") pod 
\"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.461308 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3853723d-4452-4112-99bf-c3850a983f5d-logs\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.467298 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-scripts\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.467370 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-internal-tls-certs\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.468427 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-combined-ca-bundle\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.480174 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-config-data\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc 
kubenswrapper[4802]: I1201 20:18:13.482153 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3853723d-4452-4112-99bf-c3850a983f5d-public-tls-certs\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.483671 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8xg5\" (UniqueName: \"kubernetes.io/projected/3853723d-4452-4112-99bf-c3850a983f5d-kube-api-access-t8xg5\") pod \"placement-646fbc85dd-2ttbm\" (UID: \"3853723d-4452-4112-99bf-c3850a983f5d\") " pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: I1201 20:18:13.529968 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:13 crc kubenswrapper[4802]: E1201 20:18:13.983497 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.086171 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-646fbc85dd-2ttbm"] Dec 01 20:18:14 crc kubenswrapper[4802]: W1201 20:18:14.099214 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3853723d_4452_4112_99bf_c3850a983f5d.slice/crio-0677dd489bb2cec3b544b1d9db4becad6fcd35bfbc2df61c3465214b633fec96 WatchSource:0}: Error finding container 0677dd489bb2cec3b544b1d9db4becad6fcd35bfbc2df61c3465214b633fec96: Status 404 returned error can't find the container with id 0677dd489bb2cec3b544b1d9db4becad6fcd35bfbc2df61c3465214b633fec96 Dec 01 
20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.323081 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa8400d-0d21-46be-8bb9-3dd4996d8500","Type":"ContainerStarted","Data":"15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd"} Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.323320 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="proxy-httpd" containerID="cri-o://15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd" gracePeriod=30 Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.323223 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="ceilometer-notification-agent" containerID="cri-o://556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302" gracePeriod=30 Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.323319 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="sg-core" containerID="cri-o://883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245" gracePeriod=30 Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.323505 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.328533 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hkl2c" event={"ID":"4d086846-e371-492b-81c2-0fb443b14f30","Type":"ContainerStarted","Data":"4c2b475b9b0a9df89ec774654ac31deeee6d7364aaa82df14e1cfbcbc0456ff5"} Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.330282 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-646fbc85dd-2ttbm" 
event={"ID":"3853723d-4452-4112-99bf-c3850a983f5d","Type":"ContainerStarted","Data":"c2329b5156d83b92733342fc2c48e60ae18b1243505344808c283ff4fa058b59"} Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.330309 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-646fbc85dd-2ttbm" event={"ID":"3853723d-4452-4112-99bf-c3850a983f5d","Type":"ContainerStarted","Data":"0677dd489bb2cec3b544b1d9db4becad6fcd35bfbc2df61c3465214b633fec96"} Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.371646 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-hkl2c" podStartSLOduration=3.046666812 podStartE2EDuration="46.371631034s" podCreationTimestamp="2025-12-01 20:17:28 +0000 UTC" firstStartedPulling="2025-12-01 20:17:30.31651202 +0000 UTC m=+1271.879071661" lastFinishedPulling="2025-12-01 20:18:13.641476242 +0000 UTC m=+1315.204035883" observedRunningTime="2025-12-01 20:18:14.362132786 +0000 UTC m=+1315.924692427" watchObservedRunningTime="2025-12-01 20:18:14.371631034 +0000 UTC m=+1315.934190675" Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.666924 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.845289 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q9ws\" (UniqueName: \"kubernetes.io/projected/1f0b1428-3468-4e47-939d-8614d302bd75-kube-api-access-6q9ws\") pod \"1f0b1428-3468-4e47-939d-8614d302bd75\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.845332 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-combined-ca-bundle\") pod \"1f0b1428-3468-4e47-939d-8614d302bd75\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.845371 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-config\") pod \"1f0b1428-3468-4e47-939d-8614d302bd75\" (UID: \"1f0b1428-3468-4e47-939d-8614d302bd75\") " Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.851345 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0b1428-3468-4e47-939d-8614d302bd75-kube-api-access-6q9ws" (OuterVolumeSpecName: "kube-api-access-6q9ws") pod "1f0b1428-3468-4e47-939d-8614d302bd75" (UID: "1f0b1428-3468-4e47-939d-8614d302bd75"). InnerVolumeSpecName "kube-api-access-6q9ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.877682 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f0b1428-3468-4e47-939d-8614d302bd75" (UID: "1f0b1428-3468-4e47-939d-8614d302bd75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.880431 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-config" (OuterVolumeSpecName: "config") pod "1f0b1428-3468-4e47-939d-8614d302bd75" (UID: "1f0b1428-3468-4e47-939d-8614d302bd75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.947291 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q9ws\" (UniqueName: \"kubernetes.io/projected/1f0b1428-3468-4e47-939d-8614d302bd75-kube-api-access-6q9ws\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.947321 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:14 crc kubenswrapper[4802]: I1201 20:18:14.947331 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f0b1428-3468-4e47-939d-8614d302bd75-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.340874 4802 generic.go:334] "Generic (PLEG): container finished" podID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerID="15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd" exitCode=0 Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.340908 4802 generic.go:334] "Generic (PLEG): container finished" podID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerID="883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245" exitCode=2 Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.340941 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1aa8400d-0d21-46be-8bb9-3dd4996d8500","Type":"ContainerDied","Data":"15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd"} Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.341010 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa8400d-0d21-46be-8bb9-3dd4996d8500","Type":"ContainerDied","Data":"883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245"} Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.343249 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-646fbc85dd-2ttbm" event={"ID":"3853723d-4452-4112-99bf-c3850a983f5d","Type":"ContainerStarted","Data":"e55fdf1fa16734aff0298142f62e911183852ab16ff741c14b22402b2fd5f0e1"} Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.344334 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.344389 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-646fbc85dd-2ttbm" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.348554 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xcrnm" event={"ID":"1f0b1428-3468-4e47-939d-8614d302bd75","Type":"ContainerDied","Data":"347c575746a70dfa0ac025397675a29123a49109d9ec1d9e8d352c05e93af8a8"} Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.348598 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347c575746a70dfa0ac025397675a29123a49109d9ec1d9e8d352c05e93af8a8" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.348679 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-xcrnm" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.382173 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-646fbc85dd-2ttbm" podStartSLOduration=2.3821372370000002 podStartE2EDuration="2.382137237s" podCreationTimestamp="2025-12-01 20:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:15.373311369 +0000 UTC m=+1316.935871030" watchObservedRunningTime="2025-12-01 20:18:15.382137237 +0000 UTC m=+1316.944696878" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.707698 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-4k6x7"] Dec 01 20:18:15 crc kubenswrapper[4802]: E1201 20:18:15.708123 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0b1428-3468-4e47-939d-8614d302bd75" containerName="neutron-db-sync" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.708146 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0b1428-3468-4e47-939d-8614d302bd75" containerName="neutron-db-sync" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.708343 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0b1428-3468-4e47-939d-8614d302bd75" containerName="neutron-db-sync" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.709248 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.756043 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-4k6x7"] Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.764859 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.765295 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-config\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.765355 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgqk7\" (UniqueName: \"kubernetes.io/projected/01af2259-c8f1-4bd5-b68c-962053274ff7-kube-api-access-bgqk7\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.765389 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-dns-svc\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.765433 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.824915 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5967795cb6-dtwwt"] Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.834547 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.844030 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.844301 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7v6r6" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.844440 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.847143 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.852408 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5967795cb6-dtwwt"] Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.869289 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-dns-svc\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.869359 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-ovndb-tls-certs\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.869400 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.869541 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-config\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.869588 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-httpd-config\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.869634 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-combined-ca-bundle\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.871314 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-dns-svc\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.871900 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.876577 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.876651 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-config\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.876720 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h78t\" (UniqueName: \"kubernetes.io/projected/d67137e5-b9ea-4ca0-8851-a7f041ad745e-kube-api-access-8h78t\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.876814 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgqk7\" (UniqueName: 
\"kubernetes.io/projected/01af2259-c8f1-4bd5-b68c-962053274ff7-kube-api-access-bgqk7\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.878007 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.878013 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-config\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.908632 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgqk7\" (UniqueName: \"kubernetes.io/projected/01af2259-c8f1-4bd5-b68c-962053274ff7-kube-api-access-bgqk7\") pod \"dnsmasq-dns-7b946d459c-4k6x7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.942937 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.979144 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-httpd-config\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.979385 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-combined-ca-bundle\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.979516 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h78t\" (UniqueName: \"kubernetes.io/projected/d67137e5-b9ea-4ca0-8851-a7f041ad745e-kube-api-access-8h78t\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.980208 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-ovndb-tls-certs\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.980336 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-config\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc 
kubenswrapper[4802]: I1201 20:18:15.990889 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-httpd-config\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.991685 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-config\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.992348 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-ovndb-tls-certs\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:15 crc kubenswrapper[4802]: I1201 20:18:15.997808 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-combined-ca-bundle\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:16 crc kubenswrapper[4802]: I1201 20:18:16.003747 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h78t\" (UniqueName: \"kubernetes.io/projected/d67137e5-b9ea-4ca0-8851-a7f041ad745e-kube-api-access-8h78t\") pod \"neutron-5967795cb6-dtwwt\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:16 crc kubenswrapper[4802]: I1201 20:18:16.261912 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:16 crc kubenswrapper[4802]: I1201 20:18:16.520734 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-4k6x7"] Dec 01 20:18:16 crc kubenswrapper[4802]: I1201 20:18:16.894424 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5967795cb6-dtwwt"] Dec 01 20:18:16 crc kubenswrapper[4802]: W1201 20:18:16.896857 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd67137e5_b9ea_4ca0_8851_a7f041ad745e.slice/crio-066f87775eb6b49666ecfe8e20bea315a8d035ac18558352736c87ba9c1b5bdf WatchSource:0}: Error finding container 066f87775eb6b49666ecfe8e20bea315a8d035ac18558352736c87ba9c1b5bdf: Status 404 returned error can't find the container with id 066f87775eb6b49666ecfe8e20bea315a8d035ac18558352736c87ba9c1b5bdf Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.369731 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p259k" event={"ID":"93f6c930-0ed7-480e-8725-692427ba2b9d","Type":"ContainerStarted","Data":"746c1a240b5c5c18cc9f61fb2c37fbff5380dc9cad4c85ea9cc09ab494a06a5d"} Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.372164 4802 generic.go:334] "Generic (PLEG): container finished" podID="01af2259-c8f1-4bd5-b68c-962053274ff7" containerID="acae116baa8006de268f6e9687ff7dc4e6d4856eb02885df367a94ca47216330" exitCode=0 Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.372257 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" event={"ID":"01af2259-c8f1-4bd5-b68c-962053274ff7","Type":"ContainerDied","Data":"acae116baa8006de268f6e9687ff7dc4e6d4856eb02885df367a94ca47216330"} Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.372286 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" 
event={"ID":"01af2259-c8f1-4bd5-b68c-962053274ff7","Type":"ContainerStarted","Data":"e78f04afd440bf1a467ca11b942f0d98b33d870cd250bf8c5fc9bb0a2d2152ea"} Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.393093 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-p259k" podStartSLOduration=3.62961321 podStartE2EDuration="49.393076848s" podCreationTimestamp="2025-12-01 20:17:28 +0000 UTC" firstStartedPulling="2025-12-01 20:17:29.951880689 +0000 UTC m=+1271.514440330" lastFinishedPulling="2025-12-01 20:18:15.715344327 +0000 UTC m=+1317.277903968" observedRunningTime="2025-12-01 20:18:17.391641952 +0000 UTC m=+1318.954201603" watchObservedRunningTime="2025-12-01 20:18:17.393076848 +0000 UTC m=+1318.955636489" Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.408440 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5967795cb6-dtwwt" event={"ID":"d67137e5-b9ea-4ca0-8851-a7f041ad745e","Type":"ContainerStarted","Data":"bb171adba622a509decad3632fd36cd6a4d820670ea158a5d1f714bc72223b49"} Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.408488 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5967795cb6-dtwwt" event={"ID":"d67137e5-b9ea-4ca0-8851-a7f041ad745e","Type":"ContainerStarted","Data":"a00a0d7991d7381f3b22c0ca3b8cccab6dd6c98f0dab5c53cf683ed0aebf5d75"} Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.408497 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5967795cb6-dtwwt" event={"ID":"d67137e5-b9ea-4ca0-8851-a7f041ad745e","Type":"ContainerStarted","Data":"066f87775eb6b49666ecfe8e20bea315a8d035ac18558352736c87ba9c1b5bdf"} Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.409242 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:17 crc kubenswrapper[4802]: I1201 20:18:17.458803 4802 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/neutron-5967795cb6-dtwwt" podStartSLOduration=2.458783311 podStartE2EDuration="2.458783311s" podCreationTimestamp="2025-12-01 20:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:17.457097977 +0000 UTC m=+1319.019657618" watchObservedRunningTime="2025-12-01 20:18:17.458783311 +0000 UTC m=+1319.021342952" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.038770 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.136959 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84c4b4b5d7-2ph8r"] Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137377 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-combined-ca-bundle\") pod \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " Dec 01 20:18:18 crc kubenswrapper[4802]: E1201 20:18:18.137423 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="sg-core" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137435 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="sg-core" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137437 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-log-httpd\") pod \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " Dec 01 20:18:18 crc kubenswrapper[4802]: E1201 20:18:18.137458 4802 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="ceilometer-notification-agent" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137460 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-scripts\") pod \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137464 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="ceilometer-notification-agent" Dec 01 20:18:18 crc kubenswrapper[4802]: E1201 20:18:18.137481 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="proxy-httpd" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137488 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="proxy-httpd" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137586 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-sg-core-conf-yaml\") pod \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137632 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxgpv\" (UniqueName: \"kubernetes.io/projected/1aa8400d-0d21-46be-8bb9-3dd4996d8500-kube-api-access-dxgpv\") pod \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137671 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="proxy-httpd" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 
20:18:18.137690 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="ceilometer-notification-agent" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137705 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerName="sg-core" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137791 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-run-httpd\") pod \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.137843 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-config-data\") pod \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\" (UID: \"1aa8400d-0d21-46be-8bb9-3dd4996d8500\") " Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.138615 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.140557 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1aa8400d-0d21-46be-8bb9-3dd4996d8500" (UID: "1aa8400d-0d21-46be-8bb9-3dd4996d8500"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.141740 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1aa8400d-0d21-46be-8bb9-3dd4996d8500" (UID: "1aa8400d-0d21-46be-8bb9-3dd4996d8500"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.141935 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.142121 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.151404 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa8400d-0d21-46be-8bb9-3dd4996d8500-kube-api-access-dxgpv" (OuterVolumeSpecName: "kube-api-access-dxgpv") pod "1aa8400d-0d21-46be-8bb9-3dd4996d8500" (UID: "1aa8400d-0d21-46be-8bb9-3dd4996d8500"). InnerVolumeSpecName "kube-api-access-dxgpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.152822 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-scripts" (OuterVolumeSpecName: "scripts") pod "1aa8400d-0d21-46be-8bb9-3dd4996d8500" (UID: "1aa8400d-0d21-46be-8bb9-3dd4996d8500"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.159076 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84c4b4b5d7-2ph8r"] Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.186518 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1aa8400d-0d21-46be-8bb9-3dd4996d8500" (UID: "1aa8400d-0d21-46be-8bb9-3dd4996d8500"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.206403 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1aa8400d-0d21-46be-8bb9-3dd4996d8500" (UID: "1aa8400d-0d21-46be-8bb9-3dd4996d8500"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241134 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-httpd-config\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241190 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-internal-tls-certs\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241243 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-config\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241295 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-public-tls-certs\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: 
\"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241333 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-ovndb-tls-certs\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241364 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-combined-ca-bundle\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241400 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccz24\" (UniqueName: \"kubernetes.io/projected/86540934-a020-4a27-bfa6-62fbc7cfe412-kube-api-access-ccz24\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241463 4802 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241476 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxgpv\" (UniqueName: \"kubernetes.io/projected/1aa8400d-0d21-46be-8bb9-3dd4996d8500-kube-api-access-dxgpv\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241486 4802 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241495 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241504 4802 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1aa8400d-0d21-46be-8bb9-3dd4996d8500-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.241512 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.277471 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-config-data" (OuterVolumeSpecName: "config-data") pod "1aa8400d-0d21-46be-8bb9-3dd4996d8500" (UID: "1aa8400d-0d21-46be-8bb9-3dd4996d8500"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.349219 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccz24\" (UniqueName: \"kubernetes.io/projected/86540934-a020-4a27-bfa6-62fbc7cfe412-kube-api-access-ccz24\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.349516 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-httpd-config\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.349597 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-internal-tls-certs\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.349676 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-config\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.349900 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-public-tls-certs\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 
20:18:18.349988 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-ovndb-tls-certs\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.350062 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-combined-ca-bundle\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.350153 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa8400d-0d21-46be-8bb9-3dd4996d8500-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.359341 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-config\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.362268 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-httpd-config\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.363843 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-combined-ca-bundle\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: 
\"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.364846 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-internal-tls-certs\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.373982 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-public-tls-certs\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.387974 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/86540934-a020-4a27-bfa6-62fbc7cfe412-ovndb-tls-certs\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.393456 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccz24\" (UniqueName: \"kubernetes.io/projected/86540934-a020-4a27-bfa6-62fbc7cfe412-kube-api-access-ccz24\") pod \"neutron-84c4b4b5d7-2ph8r\" (UID: \"86540934-a020-4a27-bfa6-62fbc7cfe412\") " pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.416891 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" event={"ID":"01af2259-c8f1-4bd5-b68c-962053274ff7","Type":"ContainerStarted","Data":"e0f92db194ba68a779e2b43b07484c19cb85dbf986170a76a06a129493155d95"} Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.417023 4802 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.419794 4802 generic.go:334] "Generic (PLEG): container finished" podID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" containerID="556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302" exitCode=0 Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.419863 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa8400d-0d21-46be-8bb9-3dd4996d8500","Type":"ContainerDied","Data":"556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302"} Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.419886 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1aa8400d-0d21-46be-8bb9-3dd4996d8500","Type":"ContainerDied","Data":"bc96652feadfde98d95e120e56ca086d2bc4fb2ba1f2854522c7091a9abd1849"} Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.419902 4802 scope.go:117] "RemoveContainer" containerID="15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.420010 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.426139 4802 generic.go:334] "Generic (PLEG): container finished" podID="4d086846-e371-492b-81c2-0fb443b14f30" containerID="4c2b475b9b0a9df89ec774654ac31deeee6d7364aaa82df14e1cfbcbc0456ff5" exitCode=0 Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.426224 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hkl2c" event={"ID":"4d086846-e371-492b-81c2-0fb443b14f30","Type":"ContainerDied","Data":"4c2b475b9b0a9df89ec774654ac31deeee6d7364aaa82df14e1cfbcbc0456ff5"} Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.442422 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" podStartSLOduration=3.44240639 podStartE2EDuration="3.44240639s" podCreationTimestamp="2025-12-01 20:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:18.439221459 +0000 UTC m=+1320.001781110" watchObservedRunningTime="2025-12-01 20:18:18.44240639 +0000 UTC m=+1320.004966031" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.453621 4802 scope.go:117] "RemoveContainer" containerID="883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.486086 4802 scope.go:117] "RemoveContainer" containerID="556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.501785 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.509391 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.514627 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.530991 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.577130 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.577290 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.581725 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.581741 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.600111 4802 scope.go:117] "RemoveContainer" containerID="15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd" Dec 01 20:18:18 crc kubenswrapper[4802]: E1201 20:18:18.606735 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd\": container with ID starting with 15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd not found: ID does not exist" containerID="15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.606786 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd"} err="failed to get container status \"15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd\": rpc error: code = NotFound desc = could not find container \"15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd\": container with ID starting with 
15128db726beff39f1169ffe2a39e3e64d485d46016f5c6895cca19404c5b2cd not found: ID does not exist" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.606813 4802 scope.go:117] "RemoveContainer" containerID="883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245" Dec 01 20:18:18 crc kubenswrapper[4802]: E1201 20:18:18.607438 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245\": container with ID starting with 883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245 not found: ID does not exist" containerID="883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.607460 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245"} err="failed to get container status \"883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245\": rpc error: code = NotFound desc = could not find container \"883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245\": container with ID starting with 883be8e9c0bedf2001fa38af132bce7612ab18e4da2a4baf9a2a11d2a6868245 not found: ID does not exist" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.607472 4802 scope.go:117] "RemoveContainer" containerID="556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302" Dec 01 20:18:18 crc kubenswrapper[4802]: E1201 20:18:18.607856 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302\": container with ID starting with 556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302 not found: ID does not exist" containerID="556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302" Dec 01 20:18:18 crc 
kubenswrapper[4802]: I1201 20:18:18.607883 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302"} err="failed to get container status \"556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302\": rpc error: code = NotFound desc = could not find container \"556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302\": container with ID starting with 556aee86c21bf26b6292bcec45f1ef50a44079ed4a4b7beb98139c3053279302 not found: ID does not exist" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.746739 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa8400d-0d21-46be-8bb9-3dd4996d8500" path="/var/lib/kubelet/pods/1aa8400d-0d21-46be-8bb9-3dd4996d8500/volumes" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.758830 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.758887 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-scripts\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.758922 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-config-data\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.759054 4802 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.759165 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-run-httpd\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.759316 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvllr\" (UniqueName: \"kubernetes.io/projected/b30d2288-725d-4cdf-9297-ad9afee36691-kube-api-access-nvllr\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.759347 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-log-httpd\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.861354 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvllr\" (UniqueName: \"kubernetes.io/projected/b30d2288-725d-4cdf-9297-ad9afee36691-kube-api-access-nvllr\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.861416 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-log-httpd\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.861491 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.861535 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-scripts\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.861589 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-config-data\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.861649 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.861695 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-run-httpd\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.862348 
4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-run-httpd\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.862472 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-log-httpd\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.864293 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.864649 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.867004 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.875758 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.877627 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-scripts\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 
crc kubenswrapper[4802]: I1201 20:18:18.878045 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-config-data\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:18 crc kubenswrapper[4802]: I1201 20:18:18.972503 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvllr\" (UniqueName: \"kubernetes.io/projected/b30d2288-725d-4cdf-9297-ad9afee36691-kube-api-access-nvllr\") pod \"ceilometer-0\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " pod="openstack/ceilometer-0" Dec 01 20:18:19 crc kubenswrapper[4802]: I1201 20:18:19.128240 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84c4b4b5d7-2ph8r"] Dec 01 20:18:19 crc kubenswrapper[4802]: W1201 20:18:19.135143 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86540934_a020_4a27_bfa6_62fbc7cfe412.slice/crio-a0b08683bb0eb71f73c88c85f2c689f9dbd40bdecbf763fa1da96a0b8498ad7f WatchSource:0}: Error finding container a0b08683bb0eb71f73c88c85f2c689f9dbd40bdecbf763fa1da96a0b8498ad7f: Status 404 returned error can't find the container with id a0b08683bb0eb71f73c88c85f2c689f9dbd40bdecbf763fa1da96a0b8498ad7f Dec 01 20:18:19 crc kubenswrapper[4802]: I1201 20:18:19.250901 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:18:19 crc kubenswrapper[4802]: I1201 20:18:19.459793 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84c4b4b5d7-2ph8r" event={"ID":"86540934-a020-4a27-bfa6-62fbc7cfe412","Type":"ContainerStarted","Data":"a0b08683bb0eb71f73c88c85f2c689f9dbd40bdecbf763fa1da96a0b8498ad7f"} Dec 01 20:18:19 crc kubenswrapper[4802]: I1201 20:18:19.557358 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:19 crc kubenswrapper[4802]: W1201 20:18:19.570001 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb30d2288_725d_4cdf_9297_ad9afee36691.slice/crio-03f0f77ab56bd49bcccd3a6e64a5040dbed45f4e39e0e2b55a09c49c313286ef WatchSource:0}: Error finding container 03f0f77ab56bd49bcccd3a6e64a5040dbed45f4e39e0e2b55a09c49c313286ef: Status 404 returned error can't find the container with id 03f0f77ab56bd49bcccd3a6e64a5040dbed45f4e39e0e2b55a09c49c313286ef Dec 01 20:18:19 crc kubenswrapper[4802]: I1201 20:18:19.853574 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:18:19 crc kubenswrapper[4802]: I1201 20:18:19.997376 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-db-sync-config-data\") pod \"4d086846-e371-492b-81c2-0fb443b14f30\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " Dec 01 20:18:19 crc kubenswrapper[4802]: I1201 20:18:19.997788 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-combined-ca-bundle\") pod \"4d086846-e371-492b-81c2-0fb443b14f30\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " Dec 01 20:18:19 crc kubenswrapper[4802]: I1201 20:18:19.998030 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdqvc\" (UniqueName: \"kubernetes.io/projected/4d086846-e371-492b-81c2-0fb443b14f30-kube-api-access-bdqvc\") pod \"4d086846-e371-492b-81c2-0fb443b14f30\" (UID: \"4d086846-e371-492b-81c2-0fb443b14f30\") " Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.004985 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d086846-e371-492b-81c2-0fb443b14f30-kube-api-access-bdqvc" (OuterVolumeSpecName: "kube-api-access-bdqvc") pod "4d086846-e371-492b-81c2-0fb443b14f30" (UID: "4d086846-e371-492b-81c2-0fb443b14f30"). InnerVolumeSpecName "kube-api-access-bdqvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.006116 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4d086846-e371-492b-81c2-0fb443b14f30" (UID: "4d086846-e371-492b-81c2-0fb443b14f30"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.026540 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d086846-e371-492b-81c2-0fb443b14f30" (UID: "4d086846-e371-492b-81c2-0fb443b14f30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.099352 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdqvc\" (UniqueName: \"kubernetes.io/projected/4d086846-e371-492b-81c2-0fb443b14f30-kube-api-access-bdqvc\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.099402 4802 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.099415 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d086846-e371-492b-81c2-0fb443b14f30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.470057 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84c4b4b5d7-2ph8r" event={"ID":"86540934-a020-4a27-bfa6-62fbc7cfe412","Type":"ContainerStarted","Data":"08a0483f4ed2b989e784d5fcba267c4bebde64fc6c1aa32ab62a5c5c868a46ca"} Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.470104 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84c4b4b5d7-2ph8r" event={"ID":"86540934-a020-4a27-bfa6-62fbc7cfe412","Type":"ContainerStarted","Data":"ee1934558ec0d4226c346bcab19951cf87edb61f20f3075e3173e423b44bdbbb"} Dec 
01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.471118 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.472479 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerStarted","Data":"03f0f77ab56bd49bcccd3a6e64a5040dbed45f4e39e0e2b55a09c49c313286ef"} Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.476671 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hkl2c" event={"ID":"4d086846-e371-492b-81c2-0fb443b14f30","Type":"ContainerDied","Data":"5f816cb09c113db6299ca74920a84bc729d3c840d7ad1f95cc461245426ae3da"} Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.476718 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f816cb09c113db6299ca74920a84bc729d3c840d7ad1f95cc461245426ae3da" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.476801 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hkl2c" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.515051 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84c4b4b5d7-2ph8r" podStartSLOduration=2.515020807 podStartE2EDuration="2.515020807s" podCreationTimestamp="2025-12-01 20:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:20.503620109 +0000 UTC m=+1322.066179770" watchObservedRunningTime="2025-12-01 20:18:20.515020807 +0000 UTC m=+1322.077580448" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.785881 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6f8997c475-6j472"] Dec 01 20:18:20 crc kubenswrapper[4802]: E1201 20:18:20.791747 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d086846-e371-492b-81c2-0fb443b14f30" containerName="barbican-db-sync" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.791777 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d086846-e371-492b-81c2-0fb443b14f30" containerName="barbican-db-sync" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.791947 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d086846-e371-492b-81c2-0fb443b14f30" containerName="barbican-db-sync" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.792891 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.801913 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ptbfk" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.803173 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.803624 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.840095 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7116c50f-a3ef-4975-9dca-2070fbdac59a-config-data-custom\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.840129 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7116c50f-a3ef-4975-9dca-2070fbdac59a-combined-ca-bundle\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.840168 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7116c50f-a3ef-4975-9dca-2070fbdac59a-logs\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.840387 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-92wtv\" (UniqueName: \"kubernetes.io/projected/7116c50f-a3ef-4975-9dca-2070fbdac59a-kube-api-access-92wtv\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.840481 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7116c50f-a3ef-4975-9dca-2070fbdac59a-config-data\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.851255 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f8997c475-6j472"] Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.861443 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-67fcb6786-rkbj5"] Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.863032 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.867572 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.867929 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67fcb6786-rkbj5"] Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.927353 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-4k6x7"] Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.928236 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" podUID="01af2259-c8f1-4bd5-b68c-962053274ff7" containerName="dnsmasq-dns" containerID="cri-o://e0f92db194ba68a779e2b43b07484c19cb85dbf986170a76a06a129493155d95" gracePeriod=10 Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.942126 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9d802de-8a16-4fec-8768-b09841678cc8-logs\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.942217 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9d802de-8a16-4fec-8768-b09841678cc8-config-data-custom\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.942238 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9d802de-8a16-4fec-8768-b09841678cc8-combined-ca-bundle\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.942287 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92wtv\" (UniqueName: \"kubernetes.io/projected/7116c50f-a3ef-4975-9dca-2070fbdac59a-kube-api-access-92wtv\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.942303 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9d802de-8a16-4fec-8768-b09841678cc8-config-data\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.942333 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7116c50f-a3ef-4975-9dca-2070fbdac59a-config-data\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.942517 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmzk5\" (UniqueName: \"kubernetes.io/projected/f9d802de-8a16-4fec-8768-b09841678cc8-kube-api-access-cmzk5\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:20 crc kubenswrapper[4802]: 
I1201 20:18:20.942577 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7116c50f-a3ef-4975-9dca-2070fbdac59a-config-data-custom\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.942599 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7116c50f-a3ef-4975-9dca-2070fbdac59a-combined-ca-bundle\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.942694 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7116c50f-a3ef-4975-9dca-2070fbdac59a-logs\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.943098 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7116c50f-a3ef-4975-9dca-2070fbdac59a-logs\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.965483 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7116c50f-a3ef-4975-9dca-2070fbdac59a-config-data\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.994395 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7116c50f-a3ef-4975-9dca-2070fbdac59a-config-data-custom\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:20 crc kubenswrapper[4802]: I1201 20:18:20.994466 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wpcpj"] Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.005239 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.023001 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7116c50f-a3ef-4975-9dca-2070fbdac59a-combined-ca-bundle\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.028009 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wpcpj"] Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.047847 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92wtv\" (UniqueName: \"kubernetes.io/projected/7116c50f-a3ef-4975-9dca-2070fbdac59a-kube-api-access-92wtv\") pod \"barbican-worker-6f8997c475-6j472\" (UID: \"7116c50f-a3ef-4975-9dca-2070fbdac59a\") " pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048238 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9d802de-8a16-4fec-8768-b09841678cc8-config-data\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " 
pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048316 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048348 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048370 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-config\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048394 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmzk5\" (UniqueName: \"kubernetes.io/projected/f9d802de-8a16-4fec-8768-b09841678cc8-kube-api-access-cmzk5\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048453 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-dns-svc\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" 
(UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048487 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9d802de-8a16-4fec-8768-b09841678cc8-logs\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048540 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8c4\" (UniqueName: \"kubernetes.io/projected/5a3aac83-f91e-498d-a920-f446126e6aec-kube-api-access-mw8c4\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048564 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9d802de-8a16-4fec-8768-b09841678cc8-config-data-custom\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.048585 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9d802de-8a16-4fec-8768-b09841678cc8-combined-ca-bundle\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.054864 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9d802de-8a16-4fec-8768-b09841678cc8-logs\") pod 
\"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.077245 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9d802de-8a16-4fec-8768-b09841678cc8-config-data\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.084920 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmzk5\" (UniqueName: \"kubernetes.io/projected/f9d802de-8a16-4fec-8768-b09841678cc8-kube-api-access-cmzk5\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.088798 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9d802de-8a16-4fec-8768-b09841678cc8-combined-ca-bundle\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.097961 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9d802de-8a16-4fec-8768-b09841678cc8-config-data-custom\") pod \"barbican-keystone-listener-67fcb6786-rkbj5\" (UID: \"f9d802de-8a16-4fec-8768-b09841678cc8\") " pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.146620 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6f8997c475-6j472" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.152349 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.152403 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.152431 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-config\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.152486 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-dns-svc\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.152552 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw8c4\" (UniqueName: \"kubernetes.io/projected/5a3aac83-f91e-498d-a920-f446126e6aec-kube-api-access-mw8c4\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " 
pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.153958 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.154659 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.155409 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-config\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.156083 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-dns-svc\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.257933 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.290075 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw8c4\" (UniqueName: \"kubernetes.io/projected/5a3aac83-f91e-498d-a920-f446126e6aec-kube-api-access-mw8c4\") pod \"dnsmasq-dns-6bb684768f-wpcpj\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.331379 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86bd8cd97b-zmmcw"] Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.332892 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.339051 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.379412 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.383326 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86bd8cd97b-zmmcw"] Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.463447 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a305133e-e548-465e-bee2-abc86b1e8fe2-logs\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.463497 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spzc5\" (UniqueName: \"kubernetes.io/projected/a305133e-e548-465e-bee2-abc86b1e8fe2-kube-api-access-spzc5\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.463523 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.463542 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data-custom\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.463568 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-combined-ca-bundle\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.500174 4802 generic.go:334] "Generic (PLEG): container finished" podID="01af2259-c8f1-4bd5-b68c-962053274ff7" containerID="e0f92db194ba68a779e2b43b07484c19cb85dbf986170a76a06a129493155d95" exitCode=0 Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.500323 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" event={"ID":"01af2259-c8f1-4bd5-b68c-962053274ff7","Type":"ContainerDied","Data":"e0f92db194ba68a779e2b43b07484c19cb85dbf986170a76a06a129493155d95"} Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.568095 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a305133e-e548-465e-bee2-abc86b1e8fe2-logs\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.569868 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a305133e-e548-465e-bee2-abc86b1e8fe2-logs\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.579209 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spzc5\" (UniqueName: \"kubernetes.io/projected/a305133e-e548-465e-bee2-abc86b1e8fe2-kube-api-access-spzc5\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " 
pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.579302 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.579324 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data-custom\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.579407 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-combined-ca-bundle\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.586573 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-combined-ca-bundle\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.588469 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 
crc kubenswrapper[4802]: I1201 20:18:21.599881 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data-custom\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.601122 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spzc5\" (UniqueName: \"kubernetes.io/projected/a305133e-e548-465e-bee2-abc86b1e8fe2-kube-api-access-spzc5\") pod \"barbican-api-86bd8cd97b-zmmcw\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") " pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:21 crc kubenswrapper[4802]: I1201 20:18:21.722182 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.000940 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wpcpj"] Dec 01 20:18:22 crc kubenswrapper[4802]: W1201 20:18:22.012519 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a3aac83_f91e_498d_a920_f446126e6aec.slice/crio-e316eca1171da737ac5975e193425c80b559e7827aae3bb022ab9b4db101ceb9 WatchSource:0}: Error finding container e316eca1171da737ac5975e193425c80b559e7827aae3bb022ab9b4db101ceb9: Status 404 returned error can't find the container with id e316eca1171da737ac5975e193425c80b559e7827aae3bb022ab9b4db101ceb9 Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.095742 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6f8997c475-6j472"] Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.172216 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-67fcb6786-rkbj5"] Dec 01 20:18:22 crc kubenswrapper[4802]: W1201 20:18:22.175881 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9d802de_8a16_4fec_8768_b09841678cc8.slice/crio-6597c3d28911c61a49378b04d3cacd551710177c444cda19ac4af24f768e7146 WatchSource:0}: Error finding container 6597c3d28911c61a49378b04d3cacd551710177c444cda19ac4af24f768e7146: Status 404 returned error can't find the container with id 6597c3d28911c61a49378b04d3cacd551710177c444cda19ac4af24f768e7146 Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.314752 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86bd8cd97b-zmmcw"] Dec 01 20:18:22 crc kubenswrapper[4802]: W1201 20:18:22.321076 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda305133e_e548_465e_bee2_abc86b1e8fe2.slice/crio-9396a50b84792397d3f71a68ce1b718911a8552eabf58b11d4a0471baacb3780 WatchSource:0}: Error finding container 9396a50b84792397d3f71a68ce1b718911a8552eabf58b11d4a0471baacb3780: Status 404 returned error can't find the container with id 9396a50b84792397d3f71a68ce1b718911a8552eabf58b11d4a0471baacb3780 Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.523165 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerStarted","Data":"71ac1ff2a4257fa2f29d0b8aa78af67bf6d8eaf62e1e74a89dfd306d253005bd"} Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.524101 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" event={"ID":"f9d802de-8a16-4fec-8768-b09841678cc8","Type":"ContainerStarted","Data":"6597c3d28911c61a49378b04d3cacd551710177c444cda19ac4af24f768e7146"} Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.525734 
4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bd8cd97b-zmmcw" event={"ID":"a305133e-e548-465e-bee2-abc86b1e8fe2","Type":"ContainerStarted","Data":"c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c"} Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.525761 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bd8cd97b-zmmcw" event={"ID":"a305133e-e548-465e-bee2-abc86b1e8fe2","Type":"ContainerStarted","Data":"9396a50b84792397d3f71a68ce1b718911a8552eabf58b11d4a0471baacb3780"} Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.537156 4802 generic.go:334] "Generic (PLEG): container finished" podID="5a3aac83-f91e-498d-a920-f446126e6aec" containerID="ec26a21884d8373f7cf1f6f0c5076f574dd4375ddc5882343771e9d89ed4b5d7" exitCode=0 Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.537276 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" event={"ID":"5a3aac83-f91e-498d-a920-f446126e6aec","Type":"ContainerDied","Data":"ec26a21884d8373f7cf1f6f0c5076f574dd4375ddc5882343771e9d89ed4b5d7"} Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.537322 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" event={"ID":"5a3aac83-f91e-498d-a920-f446126e6aec","Type":"ContainerStarted","Data":"e316eca1171da737ac5975e193425c80b559e7827aae3bb022ab9b4db101ceb9"} Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.541123 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f8997c475-6j472" event={"ID":"7116c50f-a3ef-4975-9dca-2070fbdac59a","Type":"ContainerStarted","Data":"486505f3537be8ccb59b5e4f4c558f988c80b8b4a84581d182f86bc04f519430"} Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.920621 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.947615 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-nb\") pod \"01af2259-c8f1-4bd5-b68c-962053274ff7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.947692 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgqk7\" (UniqueName: \"kubernetes.io/projected/01af2259-c8f1-4bd5-b68c-962053274ff7-kube-api-access-bgqk7\") pod \"01af2259-c8f1-4bd5-b68c-962053274ff7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.947738 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-dns-svc\") pod \"01af2259-c8f1-4bd5-b68c-962053274ff7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.947973 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-sb\") pod \"01af2259-c8f1-4bd5-b68c-962053274ff7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.948046 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-config\") pod \"01af2259-c8f1-4bd5-b68c-962053274ff7\" (UID: \"01af2259-c8f1-4bd5-b68c-962053274ff7\") " Dec 01 20:18:22 crc kubenswrapper[4802]: I1201 20:18:22.958632 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/01af2259-c8f1-4bd5-b68c-962053274ff7-kube-api-access-bgqk7" (OuterVolumeSpecName: "kube-api-access-bgqk7") pod "01af2259-c8f1-4bd5-b68c-962053274ff7" (UID: "01af2259-c8f1-4bd5-b68c-962053274ff7"). InnerVolumeSpecName "kube-api-access-bgqk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.020027 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "01af2259-c8f1-4bd5-b68c-962053274ff7" (UID: "01af2259-c8f1-4bd5-b68c-962053274ff7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.033846 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-config" (OuterVolumeSpecName: "config") pod "01af2259-c8f1-4bd5-b68c-962053274ff7" (UID: "01af2259-c8f1-4bd5-b68c-962053274ff7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.036729 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01af2259-c8f1-4bd5-b68c-962053274ff7" (UID: "01af2259-c8f1-4bd5-b68c-962053274ff7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.050829 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.050869 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.050881 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgqk7\" (UniqueName: \"kubernetes.io/projected/01af2259-c8f1-4bd5-b68c-962053274ff7-kube-api-access-bgqk7\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.050891 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.068563 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "01af2259-c8f1-4bd5-b68c-962053274ff7" (UID: "01af2259-c8f1-4bd5-b68c-962053274ff7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.152050 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01af2259-c8f1-4bd5-b68c-962053274ff7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.553497 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" event={"ID":"01af2259-c8f1-4bd5-b68c-962053274ff7","Type":"ContainerDied","Data":"e78f04afd440bf1a467ca11b942f0d98b33d870cd250bf8c5fc9bb0a2d2152ea"} Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.553564 4802 scope.go:117] "RemoveContainer" containerID="e0f92db194ba68a779e2b43b07484c19cb85dbf986170a76a06a129493155d95" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.553563 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-4k6x7" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.563177 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bd8cd97b-zmmcw" event={"ID":"a305133e-e548-465e-bee2-abc86b1e8fe2","Type":"ContainerStarted","Data":"cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8"} Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.564514 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.564547 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.595170 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86bd8cd97b-zmmcw" podStartSLOduration=2.595152553 podStartE2EDuration="2.595152553s" podCreationTimestamp="2025-12-01 20:18:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:23.593063957 +0000 UTC m=+1325.155623598" watchObservedRunningTime="2025-12-01 20:18:23.595152553 +0000 UTC m=+1325.157712194" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.599057 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" event={"ID":"5a3aac83-f91e-498d-a920-f446126e6aec","Type":"ContainerStarted","Data":"1fde4a7df9507c3fbdcd3d68fbfead855de14ba9fd927c7670028ff67af6cf5b"} Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.599378 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.645117 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" podStartSLOduration=3.645093891 podStartE2EDuration="3.645093891s" podCreationTimestamp="2025-12-01 20:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:23.632112563 +0000 UTC m=+1325.194672204" watchObservedRunningTime="2025-12-01 20:18:23.645093891 +0000 UTC m=+1325.207653532" Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.674565 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-4k6x7"] Dec 01 20:18:23 crc kubenswrapper[4802]: I1201 20:18:23.688269 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-4k6x7"] Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.350856 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69b4dccd58-q9lk2"] Dec 01 20:18:24 crc kubenswrapper[4802]: E1201 20:18:24.351387 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01af2259-c8f1-4bd5-b68c-962053274ff7" 
containerName="dnsmasq-dns" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.351405 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="01af2259-c8f1-4bd5-b68c-962053274ff7" containerName="dnsmasq-dns" Dec 01 20:18:24 crc kubenswrapper[4802]: E1201 20:18:24.351429 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01af2259-c8f1-4bd5-b68c-962053274ff7" containerName="init" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.351438 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="01af2259-c8f1-4bd5-b68c-962053274ff7" containerName="init" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.351635 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="01af2259-c8f1-4bd5-b68c-962053274ff7" containerName="dnsmasq-dns" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.352794 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.356642 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.357542 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.374904 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69b4dccd58-q9lk2"] Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.486257 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-config-data-custom\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.486327 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-public-tls-certs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.486357 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-config-data\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.486402 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-internal-tls-certs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.486476 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b86c57b-7125-4ead-88b7-7f5998651f39-logs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.486508 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtccs\" (UniqueName: \"kubernetes.io/projected/2b86c57b-7125-4ead-88b7-7f5998651f39-kube-api-access-xtccs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 
20:18:24.486558 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-combined-ca-bundle\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.565945 4802 scope.go:117] "RemoveContainer" containerID="acae116baa8006de268f6e9687ff7dc4e6d4856eb02885df367a94ca47216330" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.587689 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b86c57b-7125-4ead-88b7-7f5998651f39-logs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.587758 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtccs\" (UniqueName: \"kubernetes.io/projected/2b86c57b-7125-4ead-88b7-7f5998651f39-kube-api-access-xtccs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.587817 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-combined-ca-bundle\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.587867 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-config-data-custom\") pod 
\"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.587903 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-public-tls-certs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.587929 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-config-data\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.587970 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-internal-tls-certs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.588317 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b86c57b-7125-4ead-88b7-7f5998651f39-logs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.595506 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-config-data-custom\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: 
\"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.596048 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-combined-ca-bundle\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.596880 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-config-data\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.597281 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-internal-tls-certs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.597818 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b86c57b-7125-4ead-88b7-7f5998651f39-public-tls-certs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.608543 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtccs\" (UniqueName: \"kubernetes.io/projected/2b86c57b-7125-4ead-88b7-7f5998651f39-kube-api-access-xtccs\") pod \"barbican-api-69b4dccd58-q9lk2\" (UID: \"2b86c57b-7125-4ead-88b7-7f5998651f39\") " 
pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.685533 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:24 crc kubenswrapper[4802]: I1201 20:18:24.747390 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01af2259-c8f1-4bd5-b68c-962053274ff7" path="/var/lib/kubelet/pods/01af2259-c8f1-4bd5-b68c-962053274ff7/volumes" Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.188616 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69b4dccd58-q9lk2"] Dec 01 20:18:25 crc kubenswrapper[4802]: W1201 20:18:25.190500 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b86c57b_7125_4ead_88b7_7f5998651f39.slice/crio-52fdf3dc546d32107a566f84800dca4918660f66aa39cd3290d050eafe99c406 WatchSource:0}: Error finding container 52fdf3dc546d32107a566f84800dca4918660f66aa39cd3290d050eafe99c406: Status 404 returned error can't find the container with id 52fdf3dc546d32107a566f84800dca4918660f66aa39cd3290d050eafe99c406 Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.618548 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerStarted","Data":"8c2b523f8f216da1a8d6ae97afe906bc0c00bcca6a8d031e6b4f181b0da6d9e7"} Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.620105 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" event={"ID":"f9d802de-8a16-4fec-8768-b09841678cc8","Type":"ContainerStarted","Data":"760d9f07a39b701ba864d6d33b7fc4f6d917cadf10b2076a265d37f7f075b1b4"} Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.620134 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" 
event={"ID":"f9d802de-8a16-4fec-8768-b09841678cc8","Type":"ContainerStarted","Data":"1a57e4f4d1b373b231bd31b0d966c1113cfa7b8641e42ab3633515140e9e8e6b"} Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.622462 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f8997c475-6j472" event={"ID":"7116c50f-a3ef-4975-9dca-2070fbdac59a","Type":"ContainerStarted","Data":"766cb438ca15fc2bea9f1c4460af4a27ab10d35a5cb94e46806b32ef6bcf5304"} Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.622497 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6f8997c475-6j472" event={"ID":"7116c50f-a3ef-4975-9dca-2070fbdac59a","Type":"ContainerStarted","Data":"f359f42140b15323b888ce70fe819bea58d1a3391cac107f1f9ea9f6dd4fc19e"} Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.625830 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b4dccd58-q9lk2" event={"ID":"2b86c57b-7125-4ead-88b7-7f5998651f39","Type":"ContainerStarted","Data":"b39e0a1e76be8484755661df6e0bb8b058138c727ce107868e35e1312bde45cb"} Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.625897 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b4dccd58-q9lk2" event={"ID":"2b86c57b-7125-4ead-88b7-7f5998651f39","Type":"ContainerStarted","Data":"bf6252ce8c0b2d0a6f988842c5d8d87fa76ee9f1bf16c4c6d91359d2719608a5"} Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.625909 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b4dccd58-q9lk2" event={"ID":"2b86c57b-7125-4ead-88b7-7f5998651f39","Type":"ContainerStarted","Data":"52fdf3dc546d32107a566f84800dca4918660f66aa39cd3290d050eafe99c406"} Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.626701 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.626730 4802 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.645184 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-67fcb6786-rkbj5" podStartSLOduration=3.211252039 podStartE2EDuration="5.645160519s" podCreationTimestamp="2025-12-01 20:18:20 +0000 UTC" firstStartedPulling="2025-12-01 20:18:22.177645862 +0000 UTC m=+1323.740205503" lastFinishedPulling="2025-12-01 20:18:24.611554342 +0000 UTC m=+1326.174113983" observedRunningTime="2025-12-01 20:18:25.638524602 +0000 UTC m=+1327.201084243" watchObservedRunningTime="2025-12-01 20:18:25.645160519 +0000 UTC m=+1327.207720160" Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.667160 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6f8997c475-6j472" podStartSLOduration=3.155976166 podStartE2EDuration="5.6671406s" podCreationTimestamp="2025-12-01 20:18:20 +0000 UTC" firstStartedPulling="2025-12-01 20:18:22.102731271 +0000 UTC m=+1323.665290912" lastFinishedPulling="2025-12-01 20:18:24.613895705 +0000 UTC m=+1326.176455346" observedRunningTime="2025-12-01 20:18:25.660125819 +0000 UTC m=+1327.222685460" watchObservedRunningTime="2025-12-01 20:18:25.6671406 +0000 UTC m=+1327.229700241" Dec 01 20:18:25 crc kubenswrapper[4802]: I1201 20:18:25.703320 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-69b4dccd58-q9lk2" podStartSLOduration=1.7001765070000001 podStartE2EDuration="1.700176507s" podCreationTimestamp="2025-12-01 20:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:25.689529013 +0000 UTC m=+1327.252088644" watchObservedRunningTime="2025-12-01 20:18:25.700176507 +0000 UTC m=+1327.262736138" Dec 01 20:18:26 crc kubenswrapper[4802]: 
I1201 20:18:26.658675 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerStarted","Data":"3066aab7dd39a086530bfefe431df425a39c642557553fa121be50b305dd35ab"} Dec 01 20:18:26 crc kubenswrapper[4802]: I1201 20:18:26.671286 4802 generic.go:334] "Generic (PLEG): container finished" podID="93f6c930-0ed7-480e-8725-692427ba2b9d" containerID="746c1a240b5c5c18cc9f61fb2c37fbff5380dc9cad4c85ea9cc09ab494a06a5d" exitCode=0 Dec 01 20:18:26 crc kubenswrapper[4802]: I1201 20:18:26.671464 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p259k" event={"ID":"93f6c930-0ed7-480e-8725-692427ba2b9d","Type":"ContainerDied","Data":"746c1a240b5c5c18cc9f61fb2c37fbff5380dc9cad4c85ea9cc09ab494a06a5d"} Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.081500 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-p259k" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.088937 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.089004 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.273764 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-combined-ca-bundle\") pod \"93f6c930-0ed7-480e-8725-692427ba2b9d\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.274391 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gzzp\" (UniqueName: \"kubernetes.io/projected/93f6c930-0ed7-480e-8725-692427ba2b9d-kube-api-access-5gzzp\") pod \"93f6c930-0ed7-480e-8725-692427ba2b9d\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.274505 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-db-sync-config-data\") pod \"93f6c930-0ed7-480e-8725-692427ba2b9d\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.274536 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-scripts\") pod \"93f6c930-0ed7-480e-8725-692427ba2b9d\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.274594 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f6c930-0ed7-480e-8725-692427ba2b9d-etc-machine-id\") pod \"93f6c930-0ed7-480e-8725-692427ba2b9d\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.274636 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-config-data\") pod \"93f6c930-0ed7-480e-8725-692427ba2b9d\" (UID: \"93f6c930-0ed7-480e-8725-692427ba2b9d\") " Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 
20:18:28.274962 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93f6c930-0ed7-480e-8725-692427ba2b9d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "93f6c930-0ed7-480e-8725-692427ba2b9d" (UID: "93f6c930-0ed7-480e-8725-692427ba2b9d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.276178 4802 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93f6c930-0ed7-480e-8725-692427ba2b9d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.280926 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "93f6c930-0ed7-480e-8725-692427ba2b9d" (UID: "93f6c930-0ed7-480e-8725-692427ba2b9d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.281572 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f6c930-0ed7-480e-8725-692427ba2b9d-kube-api-access-5gzzp" (OuterVolumeSpecName: "kube-api-access-5gzzp") pod "93f6c930-0ed7-480e-8725-692427ba2b9d" (UID: "93f6c930-0ed7-480e-8725-692427ba2b9d"). InnerVolumeSpecName "kube-api-access-5gzzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.283417 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-scripts" (OuterVolumeSpecName: "scripts") pod "93f6c930-0ed7-480e-8725-692427ba2b9d" (UID: "93f6c930-0ed7-480e-8725-692427ba2b9d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.310865 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93f6c930-0ed7-480e-8725-692427ba2b9d" (UID: "93f6c930-0ed7-480e-8725-692427ba2b9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.338324 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-config-data" (OuterVolumeSpecName: "config-data") pod "93f6c930-0ed7-480e-8725-692427ba2b9d" (UID: "93f6c930-0ed7-480e-8725-692427ba2b9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.377993 4802 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.378023 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.378032 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.378041 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f6c930-0ed7-480e-8725-692427ba2b9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 
20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.378049 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gzzp\" (UniqueName: \"kubernetes.io/projected/93f6c930-0ed7-480e-8725-692427ba2b9d-kube-api-access-5gzzp\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.696987 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerStarted","Data":"f8c2d42f6e0fb964039402c15d7342ab4eb95244ca07a3203b733863945c9c16"} Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.697106 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.700225 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-p259k" event={"ID":"93f6c930-0ed7-480e-8725-692427ba2b9d","Type":"ContainerDied","Data":"ab807c489c268d8ed1c172665d567f9bc6af8e781f2a03fed8d1dcdb5c2add4c"} Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.700256 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-p259k" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.700273 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab807c489c268d8ed1c172665d567f9bc6af8e781f2a03fed8d1dcdb5c2add4c" Dec 01 20:18:28 crc kubenswrapper[4802]: I1201 20:18:28.736736 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7902460270000002 podStartE2EDuration="10.736713964s" podCreationTimestamp="2025-12-01 20:18:18 +0000 UTC" firstStartedPulling="2025-12-01 20:18:19.574224442 +0000 UTC m=+1321.136784083" lastFinishedPulling="2025-12-01 20:18:27.520692369 +0000 UTC m=+1329.083252020" observedRunningTime="2025-12-01 20:18:28.718291546 +0000 UTC m=+1330.280851187" watchObservedRunningTime="2025-12-01 20:18:28.736713964 +0000 UTC m=+1330.299273605" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.005689 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 20:18:29 crc kubenswrapper[4802]: E1201 20:18:29.006322 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f6c930-0ed7-480e-8725-692427ba2b9d" containerName="cinder-db-sync" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.006390 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f6c930-0ed7-480e-8725-692427ba2b9d" containerName="cinder-db-sync" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.006623 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f6c930-0ed7-480e-8725-692427ba2b9d" containerName="cinder-db-sync" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.011273 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.015758 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.017661 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.017741 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-296w6" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.017747 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.041327 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.144309 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wpcpj"] Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.145400 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" podUID="5a3aac83-f91e-498d-a920-f446126e6aec" containerName="dnsmasq-dns" containerID="cri-o://1fde4a7df9507c3fbdcd3d68fbfead855de14ba9fd927c7670028ff67af6cf5b" gracePeriod=10 Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.147349 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.175148 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-dzvvs"] Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.176834 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.192185 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.192292 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.192349 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9b8q\" (UniqueName: \"kubernetes.io/projected/a54d4833-4ceb-4479-8230-f40e3bef89b2-kube-api-access-b9b8q\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.192420 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.192443 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " 
pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.192484 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a54d4833-4ceb-4479-8230-f40e3bef89b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.192695 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-dzvvs"] Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.294275 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.294792 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.294830 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.294857 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9b8q\" (UniqueName: 
\"kubernetes.io/projected/a54d4833-4ceb-4479-8230-f40e3bef89b2-kube-api-access-b9b8q\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.294884 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-config\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.294928 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.294949 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjpv2\" (UniqueName: \"kubernetes.io/projected/b3fe3132-a73e-4cab-b8a9-437a51d44de4-kube-api-access-bjpv2\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.294968 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.295006 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.295026 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a54d4833-4ceb-4479-8230-f40e3bef89b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.295052 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.299531 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a54d4833-4ceb-4479-8230-f40e3bef89b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.305122 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.306750 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " 
pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.316072 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.325040 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9b8q\" (UniqueName: \"kubernetes.io/projected/a54d4833-4ceb-4479-8230-f40e3bef89b2-kube-api-access-b9b8q\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.325900 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.354729 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.358483 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.365682 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.371546 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.384264 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.396654 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.396738 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-config\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.396802 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpv2\" (UniqueName: \"kubernetes.io/projected/b3fe3132-a73e-4cab-b8a9-437a51d44de4-kube-api-access-bjpv2\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.396848 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.396903 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.404832 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.404927 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.405592 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.405648 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-config\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.434517 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjpv2\" (UniqueName: 
\"kubernetes.io/projected/b3fe3132-a73e-4cab-b8a9-437a51d44de4-kube-api-access-bjpv2\") pod \"dnsmasq-dns-6d97fcdd8f-dzvvs\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.502426 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.502479 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kq7n\" (UniqueName: \"kubernetes.io/projected/e1305c34-0728-4332-91d4-2d89af98017c-kube-api-access-2kq7n\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.502523 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-scripts\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.502578 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.502609 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1305c34-0728-4332-91d4-2d89af98017c-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.502635 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1305c34-0728-4332-91d4-2d89af98017c-logs\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.502660 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data-custom\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.539564 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.608312 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.608352 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kq7n\" (UniqueName: \"kubernetes.io/projected/e1305c34-0728-4332-91d4-2d89af98017c-kube-api-access-2kq7n\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.608401 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-scripts\") pod \"cinder-api-0\" 
(UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.608458 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.608489 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1305c34-0728-4332-91d4-2d89af98017c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.608514 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1305c34-0728-4332-91d4-2d89af98017c-logs\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.608540 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data-custom\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.608691 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1305c34-0728-4332-91d4-2d89af98017c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.609644 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/e1305c34-0728-4332-91d4-2d89af98017c-logs\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.616451 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-scripts\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.617586 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.618804 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data-custom\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.621423 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.635704 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kq7n\" (UniqueName: \"kubernetes.io/projected/e1305c34-0728-4332-91d4-2d89af98017c-kube-api-access-2kq7n\") pod \"cinder-api-0\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.727102 4802 
generic.go:334] "Generic (PLEG): container finished" podID="5a3aac83-f91e-498d-a920-f446126e6aec" containerID="1fde4a7df9507c3fbdcd3d68fbfead855de14ba9fd927c7670028ff67af6cf5b" exitCode=0 Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.728109 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" event={"ID":"5a3aac83-f91e-498d-a920-f446126e6aec","Type":"ContainerDied","Data":"1fde4a7df9507c3fbdcd3d68fbfead855de14ba9fd927c7670028ff67af6cf5b"} Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.754750 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.814271 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.914157 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-config\") pod \"5a3aac83-f91e-498d-a920-f446126e6aec\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.914420 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-nb\") pod \"5a3aac83-f91e-498d-a920-f446126e6aec\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.914461 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw8c4\" (UniqueName: \"kubernetes.io/projected/5a3aac83-f91e-498d-a920-f446126e6aec-kube-api-access-mw8c4\") pod \"5a3aac83-f91e-498d-a920-f446126e6aec\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.914496 4802 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-sb\") pod \"5a3aac83-f91e-498d-a920-f446126e6aec\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.914588 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-dns-svc\") pod \"5a3aac83-f91e-498d-a920-f446126e6aec\" (UID: \"5a3aac83-f91e-498d-a920-f446126e6aec\") " Dec 01 20:18:29 crc kubenswrapper[4802]: I1201 20:18:29.920426 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a3aac83-f91e-498d-a920-f446126e6aec-kube-api-access-mw8c4" (OuterVolumeSpecName: "kube-api-access-mw8c4") pod "5a3aac83-f91e-498d-a920-f446126e6aec" (UID: "5a3aac83-f91e-498d-a920-f446126e6aec"). InnerVolumeSpecName "kube-api-access-mw8c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.014123 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5a3aac83-f91e-498d-a920-f446126e6aec" (UID: "5a3aac83-f91e-498d-a920-f446126e6aec"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.022880 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw8c4\" (UniqueName: \"kubernetes.io/projected/5a3aac83-f91e-498d-a920-f446126e6aec-kube-api-access-mw8c4\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.022934 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.028150 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-config" (OuterVolumeSpecName: "config") pod "5a3aac83-f91e-498d-a920-f446126e6aec" (UID: "5a3aac83-f91e-498d-a920-f446126e6aec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.049932 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.060905 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5a3aac83-f91e-498d-a920-f446126e6aec" (UID: "5a3aac83-f91e-498d-a920-f446126e6aec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.098408 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a3aac83-f91e-498d-a920-f446126e6aec" (UID: "5a3aac83-f91e-498d-a920-f446126e6aec"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.105176 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-dzvvs"] Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.126245 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.126555 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.126709 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a3aac83-f91e-498d-a920-f446126e6aec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.370382 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.762432 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.764839 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wpcpj" event={"ID":"5a3aac83-f91e-498d-a920-f446126e6aec","Type":"ContainerDied","Data":"e316eca1171da737ac5975e193425c80b559e7827aae3bb022ab9b4db101ceb9"} Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.764898 4802 scope.go:117] "RemoveContainer" containerID="1fde4a7df9507c3fbdcd3d68fbfead855de14ba9fd927c7670028ff67af6cf5b" Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.773841 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e1305c34-0728-4332-91d4-2d89af98017c","Type":"ContainerStarted","Data":"88c3579422b444f6e8ce95a8c4f3ebd43695000417a49b9db07efcad3160c95a"} Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.789584 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a54d4833-4ceb-4479-8230-f40e3bef89b2","Type":"ContainerStarted","Data":"6ba975b8780b0250296c6499ab5a5864dccf1dac713c043b9465aae286697807"} Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.795499 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wpcpj"] Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.802264 4802 generic.go:334] "Generic (PLEG): container finished" podID="b3fe3132-a73e-4cab-b8a9-437a51d44de4" containerID="24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690" exitCode=0 Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.802309 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" event={"ID":"b3fe3132-a73e-4cab-b8a9-437a51d44de4","Type":"ContainerDied","Data":"24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690"} Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.802342 4802 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" event={"ID":"b3fe3132-a73e-4cab-b8a9-437a51d44de4","Type":"ContainerStarted","Data":"d7460f95a4223a0ac423aaa7923b9d43623a2eb25d7ab47818a03b9846854a7a"} Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.803980 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wpcpj"] Dec 01 20:18:30 crc kubenswrapper[4802]: I1201 20:18:30.820477 4802 scope.go:117] "RemoveContainer" containerID="ec26a21884d8373f7cf1f6f0c5076f574dd4375ddc5882343771e9d89ed4b5d7" Dec 01 20:18:31 crc kubenswrapper[4802]: I1201 20:18:31.748148 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 20:18:31 crc kubenswrapper[4802]: I1201 20:18:31.819355 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" event={"ID":"b3fe3132-a73e-4cab-b8a9-437a51d44de4","Type":"ContainerStarted","Data":"4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564"} Dec 01 20:18:31 crc kubenswrapper[4802]: I1201 20:18:31.819459 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:31 crc kubenswrapper[4802]: I1201 20:18:31.840127 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e1305c34-0728-4332-91d4-2d89af98017c","Type":"ContainerStarted","Data":"8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c"} Dec 01 20:18:31 crc kubenswrapper[4802]: I1201 20:18:31.841601 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" podStartSLOduration=2.841577827 podStartE2EDuration="2.841577827s" podCreationTimestamp="2025-12-01 20:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:31.839970997 +0000 UTC m=+1333.402530638" 
watchObservedRunningTime="2025-12-01 20:18:31.841577827 +0000 UTC m=+1333.404137468" Dec 01 20:18:32 crc kubenswrapper[4802]: I1201 20:18:32.736717 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a3aac83-f91e-498d-a920-f446126e6aec" path="/var/lib/kubelet/pods/5a3aac83-f91e-498d-a920-f446126e6aec/volumes" Dec 01 20:18:32 crc kubenswrapper[4802]: I1201 20:18:32.868137 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a54d4833-4ceb-4479-8230-f40e3bef89b2","Type":"ContainerStarted","Data":"3e659d53c40c29a7b65a1de0f4e1a9377e603ff95bac2c74ea41b5940b2ff97d"} Dec 01 20:18:32 crc kubenswrapper[4802]: I1201 20:18:32.874003 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e1305c34-0728-4332-91d4-2d89af98017c" containerName="cinder-api-log" containerID="cri-o://8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c" gracePeriod=30 Dec 01 20:18:32 crc kubenswrapper[4802]: I1201 20:18:32.874087 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e1305c34-0728-4332-91d4-2d89af98017c","Type":"ContainerStarted","Data":"f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a"} Dec 01 20:18:32 crc kubenswrapper[4802]: I1201 20:18:32.874126 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 20:18:32 crc kubenswrapper[4802]: I1201 20:18:32.874442 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e1305c34-0728-4332-91d4-2d89af98017c" containerName="cinder-api" containerID="cri-o://f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a" gracePeriod=30 Dec 01 20:18:32 crc kubenswrapper[4802]: I1201 20:18:32.901955 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.9019357550000002 
podStartE2EDuration="3.901935755s" podCreationTimestamp="2025-12-01 20:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:32.900589303 +0000 UTC m=+1334.463148944" watchObservedRunningTime="2025-12-01 20:18:32.901935755 +0000 UTC m=+1334.464495416" Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.694836 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.720604 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86bd8cd97b-zmmcw" Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.872403 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.903987 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a54d4833-4ceb-4479-8230-f40e3bef89b2","Type":"ContainerStarted","Data":"0a7f20f037d4ef7e92dec63fc6495e11a196653577e8e87b9e57a00915c4631c"} Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.908006 4802 generic.go:334] "Generic (PLEG): container finished" podID="e1305c34-0728-4332-91d4-2d89af98017c" containerID="f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a" exitCode=0 Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.908058 4802 generic.go:334] "Generic (PLEG): container finished" podID="e1305c34-0728-4332-91d4-2d89af98017c" containerID="8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c" exitCode=143 Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.908326 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e1305c34-0728-4332-91d4-2d89af98017c","Type":"ContainerDied","Data":"f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a"} Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.908385 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e1305c34-0728-4332-91d4-2d89af98017c","Type":"ContainerDied","Data":"8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c"} Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.908399 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e1305c34-0728-4332-91d4-2d89af98017c","Type":"ContainerDied","Data":"88c3579422b444f6e8ce95a8c4f3ebd43695000417a49b9db07efcad3160c95a"} Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.908414 4802 scope.go:117] "RemoveContainer" containerID="f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a" Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.908521 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.944223 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.54056196 podStartE2EDuration="5.944180395s" podCreationTimestamp="2025-12-01 20:18:28 +0000 UTC" firstStartedPulling="2025-12-01 20:18:30.016563493 +0000 UTC m=+1331.579123134" lastFinishedPulling="2025-12-01 20:18:31.420181928 +0000 UTC m=+1332.982741569" observedRunningTime="2025-12-01 20:18:33.933360105 +0000 UTC m=+1335.495919756" watchObservedRunningTime="2025-12-01 20:18:33.944180395 +0000 UTC m=+1335.506740036" Dec 01 20:18:33 crc kubenswrapper[4802]: I1201 20:18:33.960888 4802 scope.go:117] "RemoveContainer" containerID="8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.003366 4802 scope.go:117] "RemoveContainer" containerID="f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a" Dec 01 20:18:34 crc kubenswrapper[4802]: E1201 20:18:34.013008 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a\": container with ID starting with f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a not found: ID does not exist" containerID="f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.013090 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a"} err="failed to get container status \"f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a\": rpc error: code = NotFound desc = could not find container \"f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a\": container with ID starting with 
f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a not found: ID does not exist" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.013139 4802 scope.go:117] "RemoveContainer" containerID="8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c" Dec 01 20:18:34 crc kubenswrapper[4802]: E1201 20:18:34.013802 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c\": container with ID starting with 8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c not found: ID does not exist" containerID="8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.013835 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c"} err="failed to get container status \"8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c\": rpc error: code = NotFound desc = could not find container \"8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c\": container with ID starting with 8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c not found: ID does not exist" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.013856 4802 scope.go:117] "RemoveContainer" containerID="f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.014141 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a"} err="failed to get container status \"f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a\": rpc error: code = NotFound desc = could not find container \"f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a\": container with ID 
starting with f69494d7d09f209df28c5aeab20bf01494ef0bfe0f85c1e2a26fa4dac3f2329a not found: ID does not exist" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.014169 4802 scope.go:117] "RemoveContainer" containerID="8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.014601 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c"} err="failed to get container status \"8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c\": rpc error: code = NotFound desc = could not find container \"8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c\": container with ID starting with 8dde9c1673c544ffe770e6f16c9c755577d1a47c2a3e52871b62c1cade2dc45c not found: ID does not exist" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.038257 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data\") pod \"e1305c34-0728-4332-91d4-2d89af98017c\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.038406 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kq7n\" (UniqueName: \"kubernetes.io/projected/e1305c34-0728-4332-91d4-2d89af98017c-kube-api-access-2kq7n\") pod \"e1305c34-0728-4332-91d4-2d89af98017c\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.038609 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-scripts\") pod \"e1305c34-0728-4332-91d4-2d89af98017c\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.038689 4802 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data-custom\") pod \"e1305c34-0728-4332-91d4-2d89af98017c\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.038740 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1305c34-0728-4332-91d4-2d89af98017c-etc-machine-id\") pod \"e1305c34-0728-4332-91d4-2d89af98017c\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.038765 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1305c34-0728-4332-91d4-2d89af98017c-logs\") pod \"e1305c34-0728-4332-91d4-2d89af98017c\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.038828 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-combined-ca-bundle\") pod \"e1305c34-0728-4332-91d4-2d89af98017c\" (UID: \"e1305c34-0728-4332-91d4-2d89af98017c\") " Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.039315 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1305c34-0728-4332-91d4-2d89af98017c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e1305c34-0728-4332-91d4-2d89af98017c" (UID: "e1305c34-0728-4332-91d4-2d89af98017c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.040143 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1305c34-0728-4332-91d4-2d89af98017c-logs" (OuterVolumeSpecName: "logs") pod "e1305c34-0728-4332-91d4-2d89af98017c" (UID: "e1305c34-0728-4332-91d4-2d89af98017c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.040733 4802 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1305c34-0728-4332-91d4-2d89af98017c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.040755 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1305c34-0728-4332-91d4-2d89af98017c-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.047456 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1305c34-0728-4332-91d4-2d89af98017c-kube-api-access-2kq7n" (OuterVolumeSpecName: "kube-api-access-2kq7n") pod "e1305c34-0728-4332-91d4-2d89af98017c" (UID: "e1305c34-0728-4332-91d4-2d89af98017c"). InnerVolumeSpecName "kube-api-access-2kq7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.056372 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1305c34-0728-4332-91d4-2d89af98017c" (UID: "e1305c34-0728-4332-91d4-2d89af98017c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.074364 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-scripts" (OuterVolumeSpecName: "scripts") pod "e1305c34-0728-4332-91d4-2d89af98017c" (UID: "e1305c34-0728-4332-91d4-2d89af98017c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.101804 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1305c34-0728-4332-91d4-2d89af98017c" (UID: "e1305c34-0728-4332-91d4-2d89af98017c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.143899 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.144001 4802 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.144017 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.144033 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kq7n\" (UniqueName: \"kubernetes.io/projected/e1305c34-0728-4332-91d4-2d89af98017c-kube-api-access-2kq7n\") on node \"crc\" DevicePath \"\"" Dec 
01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.174528 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data" (OuterVolumeSpecName: "config-data") pod "e1305c34-0728-4332-91d4-2d89af98017c" (UID: "e1305c34-0728-4332-91d4-2d89af98017c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.247143 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1305c34-0728-4332-91d4-2d89af98017c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.271484 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.293494 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.332669 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 20:18:34 crc kubenswrapper[4802]: E1201 20:18:34.333230 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3aac83-f91e-498d-a920-f446126e6aec" containerName="init" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.333252 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3aac83-f91e-498d-a920-f446126e6aec" containerName="init" Dec 01 20:18:34 crc kubenswrapper[4802]: E1201 20:18:34.333272 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a3aac83-f91e-498d-a920-f446126e6aec" containerName="dnsmasq-dns" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.333279 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a3aac83-f91e-498d-a920-f446126e6aec" containerName="dnsmasq-dns" Dec 01 20:18:34 crc kubenswrapper[4802]: E1201 20:18:34.333302 4802 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e1305c34-0728-4332-91d4-2d89af98017c" containerName="cinder-api" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.333309 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1305c34-0728-4332-91d4-2d89af98017c" containerName="cinder-api" Dec 01 20:18:34 crc kubenswrapper[4802]: E1201 20:18:34.333337 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1305c34-0728-4332-91d4-2d89af98017c" containerName="cinder-api-log" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.333344 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1305c34-0728-4332-91d4-2d89af98017c" containerName="cinder-api-log" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.333527 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1305c34-0728-4332-91d4-2d89af98017c" containerName="cinder-api" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.333547 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a3aac83-f91e-498d-a920-f446126e6aec" containerName="dnsmasq-dns" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.333564 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1305c34-0728-4332-91d4-2d89af98017c" containerName="cinder-api-log" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.334655 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.339645 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.339985 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.340506 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.357368 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.389200 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.453148 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wgc7\" (UniqueName: \"kubernetes.io/projected/372989c2-e54c-4031-9b41-926f7be64266-kube-api-access-2wgc7\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.453199 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/372989c2-e54c-4031-9b41-926f7be64266-logs\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.453251 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-scripts\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 
crc kubenswrapper[4802]: I1201 20:18:34.453285 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.453308 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-public-tls-certs\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.453335 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.453417 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/372989c2-e54c-4031-9b41-926f7be64266-etc-machine-id\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.453436 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-config-data\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.453452 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-config-data-custom\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.555918 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/372989c2-e54c-4031-9b41-926f7be64266-etc-machine-id\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.555989 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-config-data\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.556021 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-config-data-custom\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.556076 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wgc7\" (UniqueName: \"kubernetes.io/projected/372989c2-e54c-4031-9b41-926f7be64266-kube-api-access-2wgc7\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.556113 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/372989c2-e54c-4031-9b41-926f7be64266-logs\") pod \"cinder-api-0\" (UID: 
\"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.556166 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-scripts\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.556242 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.556288 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-public-tls-certs\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.556335 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.556939 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/372989c2-e54c-4031-9b41-926f7be64266-etc-machine-id\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.561428 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/372989c2-e54c-4031-9b41-926f7be64266-logs\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.562707 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.565941 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-public-tls-certs\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.565953 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-scripts\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.566445 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.570795 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-config-data-custom\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.574366 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/372989c2-e54c-4031-9b41-926f7be64266-config-data\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.577911 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wgc7\" (UniqueName: \"kubernetes.io/projected/372989c2-e54c-4031-9b41-926f7be64266-kube-api-access-2wgc7\") pod \"cinder-api-0\" (UID: \"372989c2-e54c-4031-9b41-926f7be64266\") " pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.710777 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 20:18:34 crc kubenswrapper[4802]: I1201 20:18:34.735778 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1305c34-0728-4332-91d4-2d89af98017c" path="/var/lib/kubelet/pods/e1305c34-0728-4332-91d4-2d89af98017c/volumes" Dec 01 20:18:35 crc kubenswrapper[4802]: I1201 20:18:35.362130 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 20:18:35 crc kubenswrapper[4802]: W1201 20:18:35.366475 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod372989c2_e54c_4031_9b41_926f7be64266.slice/crio-5c33c4572d1e23c2e078244ca5a7a28e7654f0c594b562c3648edbfd57d11f7e WatchSource:0}: Error finding container 5c33c4572d1e23c2e078244ca5a7a28e7654f0c594b562c3648edbfd57d11f7e: Status 404 returned error can't find the container with id 5c33c4572d1e23c2e078244ca5a7a28e7654f0c594b562c3648edbfd57d11f7e Dec 01 20:18:35 crc kubenswrapper[4802]: I1201 20:18:35.951733 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"372989c2-e54c-4031-9b41-926f7be64266","Type":"ContainerStarted","Data":"5c33c4572d1e23c2e078244ca5a7a28e7654f0c594b562c3648edbfd57d11f7e"} Dec 01 20:18:36 crc kubenswrapper[4802]: I1201 20:18:36.395029 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:36 crc kubenswrapper[4802]: I1201 20:18:36.671187 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69b4dccd58-q9lk2" Dec 01 20:18:36 crc kubenswrapper[4802]: I1201 20:18:36.751196 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86bd8cd97b-zmmcw"] Dec 01 20:18:36 crc kubenswrapper[4802]: I1201 20:18:36.751486 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86bd8cd97b-zmmcw" podUID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerName="barbican-api-log" containerID="cri-o://c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c" gracePeriod=30 Dec 01 20:18:36 crc kubenswrapper[4802]: I1201 20:18:36.751763 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86bd8cd97b-zmmcw" podUID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerName="barbican-api" containerID="cri-o://cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8" gracePeriod=30 Dec 01 20:18:37 crc kubenswrapper[4802]: I1201 20:18:37.038825 4802 generic.go:334] "Generic (PLEG): container finished" podID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerID="c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c" exitCode=143 Dec 01 20:18:37 crc kubenswrapper[4802]: I1201 20:18:37.038887 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bd8cd97b-zmmcw" event={"ID":"a305133e-e548-465e-bee2-abc86b1e8fe2","Type":"ContainerDied","Data":"c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c"} Dec 01 20:18:37 crc 
kubenswrapper[4802]: I1201 20:18:37.048715 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"372989c2-e54c-4031-9b41-926f7be64266","Type":"ContainerStarted","Data":"9dddbe19ea766cc399705bbda982baa6750ada7898d5f627812bc0115f403f80"} Dec 01 20:18:37 crc kubenswrapper[4802]: I1201 20:18:37.048757 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"372989c2-e54c-4031-9b41-926f7be64266","Type":"ContainerStarted","Data":"03a03b7ee9b9da9321d9f63541ea9da273705c773bf11ecf073ebbbedf4ae6bf"} Dec 01 20:18:37 crc kubenswrapper[4802]: I1201 20:18:37.048819 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 20:18:37 crc kubenswrapper[4802]: I1201 20:18:37.089541 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.089523377 podStartE2EDuration="3.089523377s" podCreationTimestamp="2025-12-01 20:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:37.074672582 +0000 UTC m=+1338.637232223" watchObservedRunningTime="2025-12-01 20:18:37.089523377 +0000 UTC m=+1338.652083018" Dec 01 20:18:38 crc kubenswrapper[4802]: I1201 20:18:38.566236 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7977dfdfb6-dnr99" Dec 01 20:18:39 crc kubenswrapper[4802]: I1201 20:18:39.541830 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:18:39 crc kubenswrapper[4802]: I1201 20:18:39.634661 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-26ssz"] Dec 01 20:18:39 crc kubenswrapper[4802]: I1201 20:18:39.634983 4802 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" podUID="6aac0af2-647b-4f69-b07a-96bdd62c6e3d" containerName="dnsmasq-dns" containerID="cri-o://2a15361e2bc6d911d2d8f09c6971d0d8e01c81e641e2303058f26a9346425b49" gracePeriod=10 Dec 01 20:18:39 crc kubenswrapper[4802]: I1201 20:18:39.675600 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 20:18:39 crc kubenswrapper[4802]: I1201 20:18:39.720618 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 20:18:39 crc kubenswrapper[4802]: E1201 20:18:39.787492 4802 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aac0af2_647b_4f69_b07a_96bdd62c6e3d.slice/crio-conmon-2a15361e2bc6d911d2d8f09c6971d0d8e01c81e641e2303058f26a9346425b49.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aac0af2_647b_4f69_b07a_96bdd62c6e3d.slice/crio-2a15361e2bc6d911d2d8f09c6971d0d8e01c81e641e2303058f26a9346425b49.scope\": RecentStats: unable to find data in memory cache]" Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.103696 4802 generic.go:334] "Generic (PLEG): container finished" podID="6aac0af2-647b-4f69-b07a-96bdd62c6e3d" containerID="2a15361e2bc6d911d2d8f09c6971d0d8e01c81e641e2303058f26a9346425b49" exitCode=0 Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.103775 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" event={"ID":"6aac0af2-647b-4f69-b07a-96bdd62c6e3d","Type":"ContainerDied","Data":"2a15361e2bc6d911d2d8f09c6971d0d8e01c81e641e2303058f26a9346425b49"} Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.104138 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a54d4833-4ceb-4479-8230-f40e3bef89b2" 
containerName="cinder-scheduler" containerID="cri-o://3e659d53c40c29a7b65a1de0f4e1a9377e603ff95bac2c74ea41b5940b2ff97d" gracePeriod=30
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.104177 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a54d4833-4ceb-4479-8230-f40e3bef89b2" containerName="probe" containerID="cri-o://0a7f20f037d4ef7e92dec63fc6495e11a196653577e8e87b9e57a00915c4631c" gracePeriod=30
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.303634 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz"
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.340285 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-config\") pod \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.340522 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-nb\") pod \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.340558 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-dns-svc\") pod \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.340659 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-sb\") pod \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.340716 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh7zl\" (UniqueName: \"kubernetes.io/projected/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-kube-api-access-qh7zl\") pod \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\" (UID: \"6aac0af2-647b-4f69-b07a-96bdd62c6e3d\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.353458 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-kube-api-access-qh7zl" (OuterVolumeSpecName: "kube-api-access-qh7zl") pod "6aac0af2-647b-4f69-b07a-96bdd62c6e3d" (UID: "6aac0af2-647b-4f69-b07a-96bdd62c6e3d"). InnerVolumeSpecName "kube-api-access-qh7zl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.397114 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6aac0af2-647b-4f69-b07a-96bdd62c6e3d" (UID: "6aac0af2-647b-4f69-b07a-96bdd62c6e3d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.412192 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6aac0af2-647b-4f69-b07a-96bdd62c6e3d" (UID: "6aac0af2-647b-4f69-b07a-96bdd62c6e3d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.440446 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-config" (OuterVolumeSpecName: "config") pod "6aac0af2-647b-4f69-b07a-96bdd62c6e3d" (UID: "6aac0af2-647b-4f69-b07a-96bdd62c6e3d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.446043 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.446123 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh7zl\" (UniqueName: \"kubernetes.io/projected/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-kube-api-access-qh7zl\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.446192 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-config\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.446265 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.450676 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6aac0af2-647b-4f69-b07a-96bdd62c6e3d" (UID: "6aac0af2-647b-4f69-b07a-96bdd62c6e3d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.457907 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86bd8cd97b-zmmcw"
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.547011 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spzc5\" (UniqueName: \"kubernetes.io/projected/a305133e-e548-465e-bee2-abc86b1e8fe2-kube-api-access-spzc5\") pod \"a305133e-e548-465e-bee2-abc86b1e8fe2\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.547095 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a305133e-e548-465e-bee2-abc86b1e8fe2-logs\") pod \"a305133e-e548-465e-bee2-abc86b1e8fe2\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.547154 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data-custom\") pod \"a305133e-e548-465e-bee2-abc86b1e8fe2\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.547219 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-combined-ca-bundle\") pod \"a305133e-e548-465e-bee2-abc86b1e8fe2\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.547296 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data\") pod \"a305133e-e548-465e-bee2-abc86b1e8fe2\" (UID: \"a305133e-e548-465e-bee2-abc86b1e8fe2\") "
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.547909 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac0af2-647b-4f69-b07a-96bdd62c6e3d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.548176 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a305133e-e548-465e-bee2-abc86b1e8fe2-logs" (OuterVolumeSpecName: "logs") pod "a305133e-e548-465e-bee2-abc86b1e8fe2" (UID: "a305133e-e548-465e-bee2-abc86b1e8fe2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.550613 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a305133e-e548-465e-bee2-abc86b1e8fe2" (UID: "a305133e-e548-465e-bee2-abc86b1e8fe2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.550781 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a305133e-e548-465e-bee2-abc86b1e8fe2-kube-api-access-spzc5" (OuterVolumeSpecName: "kube-api-access-spzc5") pod "a305133e-e548-465e-bee2-abc86b1e8fe2" (UID: "a305133e-e548-465e-bee2-abc86b1e8fe2"). InnerVolumeSpecName "kube-api-access-spzc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.575131 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a305133e-e548-465e-bee2-abc86b1e8fe2" (UID: "a305133e-e548-465e-bee2-abc86b1e8fe2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.607245 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data" (OuterVolumeSpecName: "config-data") pod "a305133e-e548-465e-bee2-abc86b1e8fe2" (UID: "a305133e-e548-465e-bee2-abc86b1e8fe2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.649626 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a305133e-e548-465e-bee2-abc86b1e8fe2-logs\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.649657 4802 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.649669 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.649679 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a305133e-e548-465e-bee2-abc86b1e8fe2-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:40 crc kubenswrapper[4802]: I1201 20:18:40.649688 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spzc5\" (UniqueName: \"kubernetes.io/projected/a305133e-e548-465e-bee2-abc86b1e8fe2-kube-api-access-spzc5\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.119750 4802 generic.go:334] "Generic (PLEG): container finished" podID="a54d4833-4ceb-4479-8230-f40e3bef89b2" containerID="0a7f20f037d4ef7e92dec63fc6495e11a196653577e8e87b9e57a00915c4631c" exitCode=0
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.119814 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a54d4833-4ceb-4479-8230-f40e3bef89b2","Type":"ContainerDied","Data":"0a7f20f037d4ef7e92dec63fc6495e11a196653577e8e87b9e57a00915c4631c"}
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.122065 4802 generic.go:334] "Generic (PLEG): container finished" podID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerID="cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8" exitCode=0
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.122118 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86bd8cd97b-zmmcw"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.122125 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bd8cd97b-zmmcw" event={"ID":"a305133e-e548-465e-bee2-abc86b1e8fe2","Type":"ContainerDied","Data":"cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8"}
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.122148 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bd8cd97b-zmmcw" event={"ID":"a305133e-e548-465e-bee2-abc86b1e8fe2","Type":"ContainerDied","Data":"9396a50b84792397d3f71a68ce1b718911a8552eabf58b11d4a0471baacb3780"}
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.122165 4802 scope.go:117] "RemoveContainer" containerID="cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.125077 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz" event={"ID":"6aac0af2-647b-4f69-b07a-96bdd62c6e3d","Type":"ContainerDied","Data":"5822552c8c60ecf7bd611f02b69bac7cd2320c8e11ebccc0cec3dab795cc7afc"}
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.125262 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-26ssz"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.152272 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86bd8cd97b-zmmcw"]
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.155751 4802 scope.go:117] "RemoveContainer" containerID="c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.165249 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-86bd8cd97b-zmmcw"]
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.172919 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-26ssz"]
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.179735 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-26ssz"]
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.184829 4802 scope.go:117] "RemoveContainer" containerID="cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8"
Dec 01 20:18:41 crc kubenswrapper[4802]: E1201 20:18:41.185451 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8\": container with ID starting with cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8 not found: ID does not exist" containerID="cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.185487 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8"} err="failed to get container status \"cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8\": rpc error: code = NotFound desc = could not find container \"cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8\": container with ID starting with cab1e6c7bc904080db8d1ef931dfb03d80e3ee29cfd8b1872627111484f2e6e8 not found: ID does not exist"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.185511 4802 scope.go:117] "RemoveContainer" containerID="c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c"
Dec 01 20:18:41 crc kubenswrapper[4802]: E1201 20:18:41.186644 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c\": container with ID starting with c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c not found: ID does not exist" containerID="c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.186696 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c"} err="failed to get container status \"c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c\": rpc error: code = NotFound desc = could not find container \"c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c\": container with ID starting with c563583f3577044aaef5e23ce677aae73aa47ec60b8eb9fcbbe714c39dbb2a7c not found: ID does not exist"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.186724 4802 scope.go:117] "RemoveContainer" containerID="2a15361e2bc6d911d2d8f09c6971d0d8e01c81e641e2303058f26a9346425b49"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.205839 4802 scope.go:117] "RemoveContainer" containerID="d47e3f4934f8a4f27df129b98048a3eba034eee49377d00ebe202853d1d06889"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.794453 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 01 20:18:41 crc kubenswrapper[4802]: E1201 20:18:41.794912 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerName="barbican-api-log"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.794927 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerName="barbican-api-log"
Dec 01 20:18:41 crc kubenswrapper[4802]: E1201 20:18:41.794949 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aac0af2-647b-4f69-b07a-96bdd62c6e3d" containerName="init"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.794957 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aac0af2-647b-4f69-b07a-96bdd62c6e3d" containerName="init"
Dec 01 20:18:41 crc kubenswrapper[4802]: E1201 20:18:41.794971 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aac0af2-647b-4f69-b07a-96bdd62c6e3d" containerName="dnsmasq-dns"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.794980 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aac0af2-647b-4f69-b07a-96bdd62c6e3d" containerName="dnsmasq-dns"
Dec 01 20:18:41 crc kubenswrapper[4802]: E1201 20:18:41.795036 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerName="barbican-api"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.795045 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerName="barbican-api"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.795303 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerName="barbican-api-log"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.795338 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aac0af2-647b-4f69-b07a-96bdd62c6e3d" containerName="dnsmasq-dns"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.795353 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a305133e-e548-465e-bee2-abc86b1e8fe2" containerName="barbican-api"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.796180 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.799405 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.800728 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-sc94c"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.800969 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.809459 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.869887 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e13a2e-794d-4757-8c64-e1895a5e819d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.870035 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19e13a2e-794d-4757-8c64-e1895a5e819d-openstack-config\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.870082 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlmb2\" (UniqueName: \"kubernetes.io/projected/19e13a2e-794d-4757-8c64-e1895a5e819d-kube-api-access-nlmb2\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.870126 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19e13a2e-794d-4757-8c64-e1895a5e819d-openstack-config-secret\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.972191 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19e13a2e-794d-4757-8c64-e1895a5e819d-openstack-config\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.972279 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlmb2\" (UniqueName: \"kubernetes.io/projected/19e13a2e-794d-4757-8c64-e1895a5e819d-kube-api-access-nlmb2\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.972323 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19e13a2e-794d-4757-8c64-e1895a5e819d-openstack-config-secret\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.972409 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e13a2e-794d-4757-8c64-e1895a5e819d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.973070 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/19e13a2e-794d-4757-8c64-e1895a5e819d-openstack-config\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.978239 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e13a2e-794d-4757-8c64-e1895a5e819d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.986931 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/19e13a2e-794d-4757-8c64-e1895a5e819d-openstack-config-secret\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:41 crc kubenswrapper[4802]: I1201 20:18:41.992913 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlmb2\" (UniqueName: \"kubernetes.io/projected/19e13a2e-794d-4757-8c64-e1895a5e819d-kube-api-access-nlmb2\") pod \"openstackclient\" (UID: \"19e13a2e-794d-4757-8c64-e1895a5e819d\") " pod="openstack/openstackclient"
Dec 01 20:18:42 crc kubenswrapper[4802]: I1201 20:18:42.118700 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 01 20:18:42 crc kubenswrapper[4802]: I1201 20:18:42.573590 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 01 20:18:42 crc kubenswrapper[4802]: I1201 20:18:42.729505 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aac0af2-647b-4f69-b07a-96bdd62c6e3d" path="/var/lib/kubelet/pods/6aac0af2-647b-4f69-b07a-96bdd62c6e3d/volumes"
Dec 01 20:18:42 crc kubenswrapper[4802]: I1201 20:18:42.730364 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a305133e-e548-465e-bee2-abc86b1e8fe2" path="/var/lib/kubelet/pods/a305133e-e548-465e-bee2-abc86b1e8fe2/volumes"
Dec 01 20:18:43 crc kubenswrapper[4802]: I1201 20:18:43.169218 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"19e13a2e-794d-4757-8c64-e1895a5e819d","Type":"ContainerStarted","Data":"e3f5b8e8b3e298bd4d6f4ebc86588115dd9ff418f5eab8316a1d91ce11fe359b"}
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.211800 4802 generic.go:334] "Generic (PLEG): container finished" podID="a54d4833-4ceb-4479-8230-f40e3bef89b2" containerID="3e659d53c40c29a7b65a1de0f4e1a9377e603ff95bac2c74ea41b5940b2ff97d" exitCode=0
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.212077 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a54d4833-4ceb-4479-8230-f40e3bef89b2","Type":"ContainerDied","Data":"3e659d53c40c29a7b65a1de0f4e1a9377e603ff95bac2c74ea41b5940b2ff97d"}
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.363160 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.523395 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9b8q\" (UniqueName: \"kubernetes.io/projected/a54d4833-4ceb-4479-8230-f40e3bef89b2-kube-api-access-b9b8q\") pod \"a54d4833-4ceb-4479-8230-f40e3bef89b2\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") "
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.523432 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-combined-ca-bundle\") pod \"a54d4833-4ceb-4479-8230-f40e3bef89b2\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") "
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.523475 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-scripts\") pod \"a54d4833-4ceb-4479-8230-f40e3bef89b2\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") "
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.523533 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a54d4833-4ceb-4479-8230-f40e3bef89b2-etc-machine-id\") pod \"a54d4833-4ceb-4479-8230-f40e3bef89b2\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") "
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.523576 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data\") pod \"a54d4833-4ceb-4479-8230-f40e3bef89b2\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") "
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.523637 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data-custom\") pod \"a54d4833-4ceb-4479-8230-f40e3bef89b2\" (UID: \"a54d4833-4ceb-4479-8230-f40e3bef89b2\") "
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.524138 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a54d4833-4ceb-4479-8230-f40e3bef89b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a54d4833-4ceb-4479-8230-f40e3bef89b2" (UID: "a54d4833-4ceb-4479-8230-f40e3bef89b2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.537319 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a54d4833-4ceb-4479-8230-f40e3bef89b2" (UID: "a54d4833-4ceb-4479-8230-f40e3bef89b2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.553469 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-scripts" (OuterVolumeSpecName: "scripts") pod "a54d4833-4ceb-4479-8230-f40e3bef89b2" (UID: "a54d4833-4ceb-4479-8230-f40e3bef89b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.553497 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54d4833-4ceb-4479-8230-f40e3bef89b2-kube-api-access-b9b8q" (OuterVolumeSpecName: "kube-api-access-b9b8q") pod "a54d4833-4ceb-4479-8230-f40e3bef89b2" (UID: "a54d4833-4ceb-4479-8230-f40e3bef89b2"). InnerVolumeSpecName "kube-api-access-b9b8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.590476 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a54d4833-4ceb-4479-8230-f40e3bef89b2" (UID: "a54d4833-4ceb-4479-8230-f40e3bef89b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.625910 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.625943 4802 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a54d4833-4ceb-4479-8230-f40e3bef89b2-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.625955 4802 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.625965 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9b8q\" (UniqueName: \"kubernetes.io/projected/a54d4833-4ceb-4479-8230-f40e3bef89b2-kube-api-access-b9b8q\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.625976 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.643414 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data" (OuterVolumeSpecName: "config-data") pod "a54d4833-4ceb-4479-8230-f40e3bef89b2" (UID: "a54d4833-4ceb-4479-8230-f40e3bef89b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.727328 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54d4833-4ceb-4479-8230-f40e3bef89b2-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.824780 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-646fbc85dd-2ttbm"
Dec 01 20:18:44 crc kubenswrapper[4802]: I1201 20:18:44.932484 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-646fbc85dd-2ttbm"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.223667 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.224364 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a54d4833-4ceb-4479-8230-f40e3bef89b2","Type":"ContainerDied","Data":"6ba975b8780b0250296c6499ab5a5864dccf1dac713c043b9465aae286697807"}
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.224394 4802 scope.go:117] "RemoveContainer" containerID="0a7f20f037d4ef7e92dec63fc6495e11a196653577e8e87b9e57a00915c4631c"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.247340 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.251217 4802 scope.go:117] "RemoveContainer" containerID="3e659d53c40c29a7b65a1de0f4e1a9377e603ff95bac2c74ea41b5940b2ff97d"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.255975 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.266219 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 20:18:45 crc kubenswrapper[4802]: E1201 20:18:45.266708 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54d4833-4ceb-4479-8230-f40e3bef89b2" containerName="probe"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.266721 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54d4833-4ceb-4479-8230-f40e3bef89b2" containerName="probe"
Dec 01 20:18:45 crc kubenswrapper[4802]: E1201 20:18:45.266744 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54d4833-4ceb-4479-8230-f40e3bef89b2" containerName="cinder-scheduler"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.266752 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54d4833-4ceb-4479-8230-f40e3bef89b2" containerName="cinder-scheduler"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.266944 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54d4833-4ceb-4479-8230-f40e3bef89b2" containerName="probe"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.266970 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54d4833-4ceb-4479-8230-f40e3bef89b2" containerName="cinder-scheduler"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.267891 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.281122 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.292778 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.440040 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.440089 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rz6\" (UniqueName: \"kubernetes.io/projected/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-kube-api-access-c6rz6\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.440128 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.440479 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-scripts\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.440669 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-config-data\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.440707 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.541997 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-scripts\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.542074 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-config-data\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0"
Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.542154
4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.542186 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.542246 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rz6\" (UniqueName: \"kubernetes.io/projected/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-kube-api-access-c6rz6\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.542279 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.542357 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.549065 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-scripts\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.549834 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.550047 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-config-data\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.555034 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.567107 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rz6\" (UniqueName: \"kubernetes.io/projected/a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0-kube-api-access-c6rz6\") pod \"cinder-scheduler-0\" (UID: \"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0\") " pod="openstack/cinder-scheduler-0" Dec 01 20:18:45 crc kubenswrapper[4802]: I1201 20:18:45.594924 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 20:18:46 crc kubenswrapper[4802]: I1201 20:18:46.086428 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 20:18:46 crc kubenswrapper[4802]: I1201 20:18:46.234605 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0","Type":"ContainerStarted","Data":"5b9ada509ee30e5c82aa427db0f523fa711eec5c16587d6026dd03e4c9be44c7"} Dec 01 20:18:46 crc kubenswrapper[4802]: I1201 20:18:46.275037 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:46 crc kubenswrapper[4802]: I1201 20:18:46.736005 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54d4833-4ceb-4479-8230-f40e3bef89b2" path="/var/lib/kubelet/pods/a54d4833-4ceb-4479-8230-f40e3bef89b2/volumes" Dec 01 20:18:46 crc kubenswrapper[4802]: I1201 20:18:46.872471 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 20:18:47 crc kubenswrapper[4802]: I1201 20:18:47.245936 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0","Type":"ContainerStarted","Data":"b0f974410c59e040ab23f53bf77d4b628988d3a853a86cc1d73f82097852231d"} Dec 01 20:18:48 crc kubenswrapper[4802]: I1201 20:18:48.258392 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0","Type":"ContainerStarted","Data":"d5ebecfc4597582522df71a170ac8a02686654495c270a6ce2597ee6e9faaea4"} Dec 01 20:18:48 crc kubenswrapper[4802]: I1201 20:18:48.295785 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.29573934 podStartE2EDuration="3.29573934s" podCreationTimestamp="2025-12-01 
20:18:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:48.287477451 +0000 UTC m=+1349.850037102" watchObservedRunningTime="2025-12-01 20:18:48.29573934 +0000 UTC m=+1349.858298981" Dec 01 20:18:48 crc kubenswrapper[4802]: I1201 20:18:48.542362 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84c4b4b5d7-2ph8r" Dec 01 20:18:48 crc kubenswrapper[4802]: I1201 20:18:48.607440 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5967795cb6-dtwwt"] Dec 01 20:18:48 crc kubenswrapper[4802]: I1201 20:18:48.607656 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5967795cb6-dtwwt" podUID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerName="neutron-api" containerID="cri-o://a00a0d7991d7381f3b22c0ca3b8cccab6dd6c98f0dab5c53cf683ed0aebf5d75" gracePeriod=30 Dec 01 20:18:48 crc kubenswrapper[4802]: I1201 20:18:48.607877 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5967795cb6-dtwwt" podUID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerName="neutron-httpd" containerID="cri-o://bb171adba622a509decad3632fd36cd6a4d820670ea158a5d1f714bc72223b49" gracePeriod=30 Dec 01 20:18:49 crc kubenswrapper[4802]: I1201 20:18:49.254778 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 20:18:49 crc kubenswrapper[4802]: I1201 20:18:49.282836 4802 generic.go:334] "Generic (PLEG): container finished" podID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerID="bb171adba622a509decad3632fd36cd6a4d820670ea158a5d1f714bc72223b49" exitCode=0 Dec 01 20:18:49 crc kubenswrapper[4802]: I1201 20:18:49.282904 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5967795cb6-dtwwt" 
event={"ID":"d67137e5-b9ea-4ca0-8851-a7f041ad745e","Type":"ContainerDied","Data":"bb171adba622a509decad3632fd36cd6a4d820670ea158a5d1f714bc72223b49"} Dec 01 20:18:50 crc kubenswrapper[4802]: I1201 20:18:50.595770 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.332424 4802 generic.go:334] "Generic (PLEG): container finished" podID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerID="a00a0d7991d7381f3b22c0ca3b8cccab6dd6c98f0dab5c53cf683ed0aebf5d75" exitCode=0 Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.332480 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5967795cb6-dtwwt" event={"ID":"d67137e5-b9ea-4ca0-8851-a7f041ad745e","Type":"ContainerDied","Data":"a00a0d7991d7381f3b22c0ca3b8cccab6dd6c98f0dab5c53cf683ed0aebf5d75"} Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.420120 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qplvx"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.421296 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.435039 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qplvx"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.533207 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-2kqwl"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.534594 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.541792 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-aa02-account-create-update-tf6q4"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.543102 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.545092 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.551973 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2kqwl"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.562996 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-aa02-account-create-update-tf6q4"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.614094 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-operator-scripts\") pod \"nova-api-db-create-qplvx\" (UID: \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\") " pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.614197 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpsh7\" (UniqueName: \"kubernetes.io/projected/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-kube-api-access-gpsh7\") pod \"nova-api-db-create-qplvx\" (UID: \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\") " pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.717266 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-operator-scripts\") pod \"nova-api-db-create-qplvx\" (UID: \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\") " pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.717330 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-operator-scripts\") pod \"nova-cell0-db-create-2kqwl\" (UID: \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\") " pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.717376 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbj6f\" (UniqueName: \"kubernetes.io/projected/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-kube-api-access-xbj6f\") pod \"nova-cell0-db-create-2kqwl\" (UID: \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\") " pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.717426 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc9020e-4f69-49c4-9788-aa84f98d8c77-operator-scripts\") pod \"nova-api-aa02-account-create-update-tf6q4\" (UID: \"efc9020e-4f69-49c4-9788-aa84f98d8c77\") " pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.717453 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpsh7\" (UniqueName: \"kubernetes.io/projected/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-kube-api-access-gpsh7\") pod \"nova-api-db-create-qplvx\" (UID: \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\") " pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.717500 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86p6v\" (UniqueName: \"kubernetes.io/projected/efc9020e-4f69-49c4-9788-aa84f98d8c77-kube-api-access-86p6v\") pod \"nova-api-aa02-account-create-update-tf6q4\" (UID: \"efc9020e-4f69-49c4-9788-aa84f98d8c77\") " pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.718476 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-operator-scripts\") pod \"nova-api-db-create-qplvx\" (UID: \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\") " pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.743320 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-xf6km"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.744564 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xf6km" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.745692 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xf6km"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.759370 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpsh7\" (UniqueName: \"kubernetes.io/projected/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-kube-api-access-gpsh7\") pod \"nova-api-db-create-qplvx\" (UID: \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\") " pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.768306 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ee5c-account-create-update-xw4zq"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.769600 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.771784 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.774103 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ee5c-account-create-update-xw4zq"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.820233 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-operator-scripts\") pod \"nova-cell0-db-create-2kqwl\" (UID: \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\") " pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.820320 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbj6f\" (UniqueName: \"kubernetes.io/projected/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-kube-api-access-xbj6f\") pod \"nova-cell0-db-create-2kqwl\" (UID: \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\") " pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.820403 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc9020e-4f69-49c4-9788-aa84f98d8c77-operator-scripts\") pod \"nova-api-aa02-account-create-update-tf6q4\" (UID: \"efc9020e-4f69-49c4-9788-aa84f98d8c77\") " pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.820493 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86p6v\" (UniqueName: \"kubernetes.io/projected/efc9020e-4f69-49c4-9788-aa84f98d8c77-kube-api-access-86p6v\") pod \"nova-api-aa02-account-create-update-tf6q4\" (UID: 
\"efc9020e-4f69-49c4-9788-aa84f98d8c77\") " pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.821706 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-operator-scripts\") pod \"nova-cell0-db-create-2kqwl\" (UID: \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\") " pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.824330 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc9020e-4f69-49c4-9788-aa84f98d8c77-operator-scripts\") pod \"nova-api-aa02-account-create-update-tf6q4\" (UID: \"efc9020e-4f69-49c4-9788-aa84f98d8c77\") " pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.838648 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86p6v\" (UniqueName: \"kubernetes.io/projected/efc9020e-4f69-49c4-9788-aa84f98d8c77-kube-api-access-86p6v\") pod \"nova-api-aa02-account-create-update-tf6q4\" (UID: \"efc9020e-4f69-49c4-9788-aa84f98d8c77\") " pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.839880 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbj6f\" (UniqueName: \"kubernetes.io/projected/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-kube-api-access-xbj6f\") pod \"nova-cell0-db-create-2kqwl\" (UID: \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\") " pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.864721 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.872825 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.923340 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a13d5208-da5b-4c02-84e5-871547bbafbf-operator-scripts\") pod \"nova-cell1-db-create-xf6km\" (UID: \"a13d5208-da5b-4c02-84e5-871547bbafbf\") " pod="openstack/nova-cell1-db-create-xf6km" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.923393 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc3b62ea-1f25-4221-b09f-5a915164fa80-operator-scripts\") pod \"nova-cell0-ee5c-account-create-update-xw4zq\" (UID: \"fc3b62ea-1f25-4221-b09f-5a915164fa80\") " pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.923427 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzgjp\" (UniqueName: \"kubernetes.io/projected/fc3b62ea-1f25-4221-b09f-5a915164fa80-kube-api-access-dzgjp\") pod \"nova-cell0-ee5c-account-create-update-xw4zq\" (UID: \"fc3b62ea-1f25-4221-b09f-5a915164fa80\") " pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.923478 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rj2\" (UniqueName: \"kubernetes.io/projected/a13d5208-da5b-4c02-84e5-871547bbafbf-kube-api-access-48rj2\") pod \"nova-cell1-db-create-xf6km\" (UID: \"a13d5208-da5b-4c02-84e5-871547bbafbf\") " pod="openstack/nova-cell1-db-create-xf6km" Dec 01 
20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.933361 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.933617 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="ceilometer-central-agent" containerID="cri-o://71ac1ff2a4257fa2f29d0b8aa78af67bf6d8eaf62e1e74a89dfd306d253005bd" gracePeriod=30 Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.933752 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="proxy-httpd" containerID="cri-o://f8c2d42f6e0fb964039402c15d7342ab4eb95244ca07a3203b733863945c9c16" gracePeriod=30 Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.933795 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="sg-core" containerID="cri-o://3066aab7dd39a086530bfefe431df425a39c642557553fa121be50b305dd35ab" gracePeriod=30 Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.933826 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="ceilometer-notification-agent" containerID="cri-o://8c2b523f8f216da1a8d6ae97afe906bc0c00bcca6a8d031e6b4f181b0da6d9e7" gracePeriod=30 Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.961805 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3c20-account-create-update-frnjt"] Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.962965 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.969711 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 20:18:52 crc kubenswrapper[4802]: I1201 20:18:52.981484 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3c20-account-create-update-frnjt"] Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.025190 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzgjp\" (UniqueName: \"kubernetes.io/projected/fc3b62ea-1f25-4221-b09f-5a915164fa80-kube-api-access-dzgjp\") pod \"nova-cell0-ee5c-account-create-update-xw4zq\" (UID: \"fc3b62ea-1f25-4221-b09f-5a915164fa80\") " pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.025337 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rj2\" (UniqueName: \"kubernetes.io/projected/a13d5208-da5b-4c02-84e5-871547bbafbf-kube-api-access-48rj2\") pod \"nova-cell1-db-create-xf6km\" (UID: \"a13d5208-da5b-4c02-84e5-871547bbafbf\") " pod="openstack/nova-cell1-db-create-xf6km" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.025486 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a13d5208-da5b-4c02-84e5-871547bbafbf-operator-scripts\") pod \"nova-cell1-db-create-xf6km\" (UID: \"a13d5208-da5b-4c02-84e5-871547bbafbf\") " pod="openstack/nova-cell1-db-create-xf6km" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.025521 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc3b62ea-1f25-4221-b09f-5a915164fa80-operator-scripts\") pod \"nova-cell0-ee5c-account-create-update-xw4zq\" (UID: 
\"fc3b62ea-1f25-4221-b09f-5a915164fa80\") " pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.026635 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc3b62ea-1f25-4221-b09f-5a915164fa80-operator-scripts\") pod \"nova-cell0-ee5c-account-create-update-xw4zq\" (UID: \"fc3b62ea-1f25-4221-b09f-5a915164fa80\") " pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.028366 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a13d5208-da5b-4c02-84e5-871547bbafbf-operator-scripts\") pod \"nova-cell1-db-create-xf6km\" (UID: \"a13d5208-da5b-4c02-84e5-871547bbafbf\") " pod="openstack/nova-cell1-db-create-xf6km" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.043467 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.049650 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rj2\" (UniqueName: \"kubernetes.io/projected/a13d5208-da5b-4c02-84e5-871547bbafbf-kube-api-access-48rj2\") pod \"nova-cell1-db-create-xf6km\" (UID: \"a13d5208-da5b-4c02-84e5-871547bbafbf\") " pod="openstack/nova-cell1-db-create-xf6km" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.054229 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzgjp\" (UniqueName: \"kubernetes.io/projected/fc3b62ea-1f25-4221-b09f-5a915164fa80-kube-api-access-dzgjp\") pod \"nova-cell0-ee5c-account-create-update-xw4zq\" (UID: \"fc3b62ea-1f25-4221-b09f-5a915164fa80\") " pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.121667 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xf6km" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.127258 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxh26\" (UniqueName: \"kubernetes.io/projected/53cf7526-8d6a-4789-8582-854087ec7b2b-kube-api-access-gxh26\") pod \"nova-cell1-3c20-account-create-update-frnjt\" (UID: \"53cf7526-8d6a-4789-8582-854087ec7b2b\") " pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.127310 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53cf7526-8d6a-4789-8582-854087ec7b2b-operator-scripts\") pod \"nova-cell1-3c20-account-create-update-frnjt\" (UID: \"53cf7526-8d6a-4789-8582-854087ec7b2b\") " pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:53 crc kubenswrapper[4802]: 
I1201 20:18:53.228626 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxh26\" (UniqueName: \"kubernetes.io/projected/53cf7526-8d6a-4789-8582-854087ec7b2b-kube-api-access-gxh26\") pod \"nova-cell1-3c20-account-create-update-frnjt\" (UID: \"53cf7526-8d6a-4789-8582-854087ec7b2b\") " pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.229038 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53cf7526-8d6a-4789-8582-854087ec7b2b-operator-scripts\") pod \"nova-cell1-3c20-account-create-update-frnjt\" (UID: \"53cf7526-8d6a-4789-8582-854087ec7b2b\") " pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.230166 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53cf7526-8d6a-4789-8582-854087ec7b2b-operator-scripts\") pod \"nova-cell1-3c20-account-create-update-frnjt\" (UID: \"53cf7526-8d6a-4789-8582-854087ec7b2b\") " pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.235808 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.247922 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxh26\" (UniqueName: \"kubernetes.io/projected/53cf7526-8d6a-4789-8582-854087ec7b2b-kube-api-access-gxh26\") pod \"nova-cell1-3c20-account-create-update-frnjt\" (UID: \"53cf7526-8d6a-4789-8582-854087ec7b2b\") " pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.295818 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.352126 4802 generic.go:334] "Generic (PLEG): container finished" podID="b30d2288-725d-4cdf-9297-ad9afee36691" containerID="f8c2d42f6e0fb964039402c15d7342ab4eb95244ca07a3203b733863945c9c16" exitCode=0 Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.352159 4802 generic.go:334] "Generic (PLEG): container finished" podID="b30d2288-725d-4cdf-9297-ad9afee36691" containerID="3066aab7dd39a086530bfefe431df425a39c642557553fa121be50b305dd35ab" exitCode=2 Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.352218 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerDied","Data":"f8c2d42f6e0fb964039402c15d7342ab4eb95244ca07a3203b733863945c9c16"} Dec 01 20:18:53 crc kubenswrapper[4802]: I1201 20:18:53.352249 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerDied","Data":"3066aab7dd39a086530bfefe431df425a39c642557553fa121be50b305dd35ab"} Dec 01 20:18:54 crc kubenswrapper[4802]: I1201 20:18:54.362408 4802 generic.go:334] "Generic (PLEG): container finished" podID="b30d2288-725d-4cdf-9297-ad9afee36691" containerID="8c2b523f8f216da1a8d6ae97afe906bc0c00bcca6a8d031e6b4f181b0da6d9e7" exitCode=0 Dec 01 20:18:54 crc kubenswrapper[4802]: I1201 20:18:54.362448 4802 generic.go:334] "Generic (PLEG): container finished" podID="b30d2288-725d-4cdf-9297-ad9afee36691" containerID="71ac1ff2a4257fa2f29d0b8aa78af67bf6d8eaf62e1e74a89dfd306d253005bd" exitCode=0 Dec 01 20:18:54 crc kubenswrapper[4802]: I1201 20:18:54.362465 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerDied","Data":"8c2b523f8f216da1a8d6ae97afe906bc0c00bcca6a8d031e6b4f181b0da6d9e7"} 
Dec 01 20:18:54 crc kubenswrapper[4802]: I1201 20:18:54.362490 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerDied","Data":"71ac1ff2a4257fa2f29d0b8aa78af67bf6d8eaf62e1e74a89dfd306d253005bd"} Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.299570 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.376370 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"19e13a2e-794d-4757-8c64-e1895a5e819d","Type":"ContainerStarted","Data":"a101ae7a907031a2fedb49e90af75d19cbca919ea42f4d11fb5de943de2dfd64"} Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.380785 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-run-httpd\") pod \"b30d2288-725d-4cdf-9297-ad9afee36691\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.380923 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-sg-core-conf-yaml\") pod \"b30d2288-725d-4cdf-9297-ad9afee36691\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.381078 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvllr\" (UniqueName: \"kubernetes.io/projected/b30d2288-725d-4cdf-9297-ad9afee36691-kube-api-access-nvllr\") pod \"b30d2288-725d-4cdf-9297-ad9afee36691\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.381143 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-scripts\") pod \"b30d2288-725d-4cdf-9297-ad9afee36691\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.381220 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-combined-ca-bundle\") pod \"b30d2288-725d-4cdf-9297-ad9afee36691\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.381292 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-config-data\") pod \"b30d2288-725d-4cdf-9297-ad9afee36691\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.381370 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-log-httpd\") pod \"b30d2288-725d-4cdf-9297-ad9afee36691\" (UID: \"b30d2288-725d-4cdf-9297-ad9afee36691\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.382065 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b30d2288-725d-4cdf-9297-ad9afee36691" (UID: "b30d2288-725d-4cdf-9297-ad9afee36691"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.382819 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b30d2288-725d-4cdf-9297-ad9afee36691" (UID: "b30d2288-725d-4cdf-9297-ad9afee36691"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.387301 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b30d2288-725d-4cdf-9297-ad9afee36691","Type":"ContainerDied","Data":"03f0f77ab56bd49bcccd3a6e64a5040dbed45f4e39e0e2b55a09c49c313286ef"} Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.387365 4802 scope.go:117] "RemoveContainer" containerID="f8c2d42f6e0fb964039402c15d7342ab4eb95244ca07a3203b733863945c9c16" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.387556 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.388476 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-scripts" (OuterVolumeSpecName: "scripts") pod "b30d2288-725d-4cdf-9297-ad9afee36691" (UID: "b30d2288-725d-4cdf-9297-ad9afee36691"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.391965 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30d2288-725d-4cdf-9297-ad9afee36691-kube-api-access-nvllr" (OuterVolumeSpecName: "kube-api-access-nvllr") pod "b30d2288-725d-4cdf-9297-ad9afee36691" (UID: "b30d2288-725d-4cdf-9297-ad9afee36691"). InnerVolumeSpecName "kube-api-access-nvllr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.426133 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b30d2288-725d-4cdf-9297-ad9afee36691" (UID: "b30d2288-725d-4cdf-9297-ad9afee36691"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.427515 4802 scope.go:117] "RemoveContainer" containerID="3066aab7dd39a086530bfefe431df425a39c642557553fa121be50b305dd35ab" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.447211 4802 scope.go:117] "RemoveContainer" containerID="8c2b523f8f216da1a8d6ae97afe906bc0c00bcca6a8d031e6b4f181b0da6d9e7" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.454333 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.483433 4802 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.483462 4802 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b30d2288-725d-4cdf-9297-ad9afee36691-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.483491 4802 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.483500 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.483510 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvllr\" (UniqueName: \"kubernetes.io/projected/b30d2288-725d-4cdf-9297-ad9afee36691-kube-api-access-nvllr\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: 
I1201 20:18:55.486712 4802 scope.go:117] "RemoveContainer" containerID="71ac1ff2a4257fa2f29d0b8aa78af67bf6d8eaf62e1e74a89dfd306d253005bd" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.491948 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b30d2288-725d-4cdf-9297-ad9afee36691" (UID: "b30d2288-725d-4cdf-9297-ad9afee36691"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.513697 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.152132244 podStartE2EDuration="14.513672977s" podCreationTimestamp="2025-12-01 20:18:41 +0000 UTC" firstStartedPulling="2025-12-01 20:18:42.579839808 +0000 UTC m=+1344.142399449" lastFinishedPulling="2025-12-01 20:18:54.941380551 +0000 UTC m=+1356.503940182" observedRunningTime="2025-12-01 20:18:55.403082875 +0000 UTC m=+1356.965642516" watchObservedRunningTime="2025-12-01 20:18:55.513672977 +0000 UTC m=+1357.076232638" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.524097 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-config-data" (OuterVolumeSpecName: "config-data") pod "b30d2288-725d-4cdf-9297-ad9afee36691" (UID: "b30d2288-725d-4cdf-9297-ad9afee36691"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.583871 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qplvx"] Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.584575 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-httpd-config\") pod \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.584775 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h78t\" (UniqueName: \"kubernetes.io/projected/d67137e5-b9ea-4ca0-8851-a7f041ad745e-kube-api-access-8h78t\") pod \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.584847 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-config\") pod \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.584874 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-combined-ca-bundle\") pod \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.584933 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-ovndb-tls-certs\") pod \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\" (UID: \"d67137e5-b9ea-4ca0-8851-a7f041ad745e\") " Dec 01 20:18:55 crc 
kubenswrapper[4802]: I1201 20:18:55.585336 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.585357 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b30d2288-725d-4cdf-9297-ad9afee36691-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.591416 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67137e5-b9ea-4ca0-8851-a7f041ad745e-kube-api-access-8h78t" (OuterVolumeSpecName: "kube-api-access-8h78t") pod "d67137e5-b9ea-4ca0-8851-a7f041ad745e" (UID: "d67137e5-b9ea-4ca0-8851-a7f041ad745e"). InnerVolumeSpecName "kube-api-access-8h78t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.597502 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d67137e5-b9ea-4ca0-8851-a7f041ad745e" (UID: "d67137e5-b9ea-4ca0-8851-a7f041ad745e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.635491 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ee5c-account-create-update-xw4zq"] Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.669824 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-config" (OuterVolumeSpecName: "config") pod "d67137e5-b9ea-4ca0-8851-a7f041ad745e" (UID: "d67137e5-b9ea-4ca0-8851-a7f041ad745e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.673592 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d67137e5-b9ea-4ca0-8851-a7f041ad745e" (UID: "d67137e5-b9ea-4ca0-8851-a7f041ad745e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.687398 4802 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.687422 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h78t\" (UniqueName: \"kubernetes.io/projected/d67137e5-b9ea-4ca0-8851-a7f041ad745e-kube-api-access-8h78t\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.687433 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.687441 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.709379 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d67137e5-b9ea-4ca0-8851-a7f041ad745e" (UID: "d67137e5-b9ea-4ca0-8851-a7f041ad745e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.762418 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.789276 4802 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67137e5-b9ea-4ca0-8851-a7f041ad745e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.793643 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.815327 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:55 crc kubenswrapper[4802]: E1201 20:18:55.816076 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerName="neutron-httpd" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816113 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerName="neutron-httpd" Dec 01 20:18:55 crc kubenswrapper[4802]: E1201 20:18:55.816161 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="ceilometer-central-agent" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816169 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="ceilometer-central-agent" Dec 01 20:18:55 crc kubenswrapper[4802]: E1201 20:18:55.816212 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="ceilometer-notification-agent" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816219 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="ceilometer-notification-agent" Dec 
01 20:18:55 crc kubenswrapper[4802]: E1201 20:18:55.816230 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="sg-core" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816236 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="sg-core" Dec 01 20:18:55 crc kubenswrapper[4802]: E1201 20:18:55.816269 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerName="neutron-api" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816277 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerName="neutron-api" Dec 01 20:18:55 crc kubenswrapper[4802]: E1201 20:18:55.816287 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="proxy-httpd" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816295 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="proxy-httpd" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816549 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="ceilometer-notification-agent" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816569 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="proxy-httpd" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816625 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerName="neutron-httpd" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816632 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" containerName="neutron-api" Dec 01 20:18:55 crc 
kubenswrapper[4802]: I1201 20:18:55.816645 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="sg-core" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.816656 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" containerName="ceilometer-central-agent" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.819629 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.832857 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.833071 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.835211 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.859892 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2kqwl"] Dec 01 20:18:55 crc kubenswrapper[4802]: W1201 20:18:55.863390 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefc9020e_4f69_49c4_9788_aa84f98d8c77.slice/crio-b61abdc29f47f7346fcfb6b19cf4c2e7513049d57d65eaf200103406ad389868 WatchSource:0}: Error finding container b61abdc29f47f7346fcfb6b19cf4c2e7513049d57d65eaf200103406ad389868: Status 404 returned error can't find the container with id b61abdc29f47f7346fcfb6b19cf4c2e7513049d57d65eaf200103406ad389868 Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.868086 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xf6km"] Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.884298 4802 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell1-3c20-account-create-update-frnjt"] Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.891459 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-config-data\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.891643 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-scripts\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.891735 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-run-httpd\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.891893 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.892362 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-aa02-account-create-update-tf6q4"] Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.900452 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.900630 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-log-httpd\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:55 crc kubenswrapper[4802]: I1201 20:18:55.903342 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcs4s\" (UniqueName: \"kubernetes.io/projected/8049ca74-7d3d-4532-8572-d5581f5d62f2-kube-api-access-qcs4s\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.005366 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-scripts\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.005493 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-run-httpd\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.005589 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 
20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.005623 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.005670 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-log-httpd\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.005704 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcs4s\" (UniqueName: \"kubernetes.io/projected/8049ca74-7d3d-4532-8572-d5581f5d62f2-kube-api-access-qcs4s\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.005746 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-config-data\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.011288 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-run-httpd\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.011524 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-log-httpd\") 
pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.016351 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.017998 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.022236 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-scripts\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.024036 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-config-data\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.040962 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcs4s\" (UniqueName: \"kubernetes.io/projected/8049ca74-7d3d-4532-8572-d5581f5d62f2-kube-api-access-qcs4s\") pod \"ceilometer-0\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.122692 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.260278 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.414071 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5967795cb6-dtwwt" event={"ID":"d67137e5-b9ea-4ca0-8851-a7f041ad745e","Type":"ContainerDied","Data":"066f87775eb6b49666ecfe8e20bea315a8d035ac18558352736c87ba9c1b5bdf"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.414147 4802 scope.go:117] "RemoveContainer" containerID="bb171adba622a509decad3632fd36cd6a4d820670ea158a5d1f714bc72223b49" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.414491 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5967795cb6-dtwwt" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.435319 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" event={"ID":"fc3b62ea-1f25-4221-b09f-5a915164fa80","Type":"ContainerStarted","Data":"9e7d7b9c7371b813189ed90f7f50dd2604410c6435ea5d2c30fc4a0a5501a80d"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.435664 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" event={"ID":"fc3b62ea-1f25-4221-b09f-5a915164fa80","Type":"ContainerStarted","Data":"afa80b95b740fba45b5f8daf1fba93796eea497b9fdc6ba955db8bb24c733db7"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.452006 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xf6km" event={"ID":"a13d5208-da5b-4c02-84e5-871547bbafbf","Type":"ContainerStarted","Data":"989dd14f44779cadab7b021d956f743850af71d135a9136601e559fbba569b63"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.452097 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-xf6km" event={"ID":"a13d5208-da5b-4c02-84e5-871547bbafbf","Type":"ContainerStarted","Data":"28ab1c52fdbfba31d8a056ffbd9569ab651346f7712e885fa35048b7a10e46c5"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.466569 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qplvx" event={"ID":"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e","Type":"ContainerStarted","Data":"40026d4792378d5240c3f320f5f8d901bda39fbae51edd49b253b8675229f89c"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.466631 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qplvx" event={"ID":"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e","Type":"ContainerStarted","Data":"d07853264e1d7e312941c7f193b1155c508ad2b19bdd593688d5ffdc8aa3d94e"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.469148 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" podStartSLOduration=4.469120112 podStartE2EDuration="4.469120112s" podCreationTimestamp="2025-12-01 20:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:56.454708999 +0000 UTC m=+1358.017268640" watchObservedRunningTime="2025-12-01 20:18:56.469120112 +0000 UTC m=+1358.031679753" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.483969 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2kqwl" event={"ID":"b26bcb68-81a2-43fa-b81f-29dbc8ca213e","Type":"ContainerStarted","Data":"9a72b4640dc57e0f397acdcae94effa644ef7c6f13439da636080dfd29401d8b"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.484043 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2kqwl" 
event={"ID":"b26bcb68-81a2-43fa-b81f-29dbc8ca213e","Type":"ContainerStarted","Data":"1de0977a3db16cb9b0bcac2956cf56dd7bdef92f58d6eb51c17eeb48ddb17a9a"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.486346 4802 scope.go:117] "RemoveContainer" containerID="a00a0d7991d7381f3b22c0ca3b8cccab6dd6c98f0dab5c53cf683ed0aebf5d75" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.500970 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-xf6km" podStartSLOduration=4.500937771 podStartE2EDuration="4.500937771s" podCreationTimestamp="2025-12-01 20:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:56.476929427 +0000 UTC m=+1358.039489068" watchObservedRunningTime="2025-12-01 20:18:56.500937771 +0000 UTC m=+1358.063497412" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.503058 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-aa02-account-create-update-tf6q4" event={"ID":"efc9020e-4f69-49c4-9788-aa84f98d8c77","Type":"ContainerStarted","Data":"4f2dbefc833f7007d51200a196e4c47c2805ddacb4d255b9f0013ee4abd07172"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.503104 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-aa02-account-create-update-tf6q4" event={"ID":"efc9020e-4f69-49c4-9788-aa84f98d8c77","Type":"ContainerStarted","Data":"b61abdc29f47f7346fcfb6b19cf4c2e7513049d57d65eaf200103406ad389868"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.533869 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-2kqwl" podStartSLOduration=4.533844884 podStartE2EDuration="4.533844884s" podCreationTimestamp="2025-12-01 20:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
20:18:56.526897695 +0000 UTC m=+1358.089457336" watchObservedRunningTime="2025-12-01 20:18:56.533844884 +0000 UTC m=+1358.096404515" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.575974 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3c20-account-create-update-frnjt" event={"ID":"53cf7526-8d6a-4789-8582-854087ec7b2b","Type":"ContainerStarted","Data":"6bea7ec021452b58fc7fec289da0c417cb45f371401c8b3cf7ed26ab7e516354"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.576044 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3c20-account-create-update-frnjt" event={"ID":"53cf7526-8d6a-4789-8582-854087ec7b2b","Type":"ContainerStarted","Data":"d7dd12452a1e52b40a040b4c9b003a3cbdcbbe851f95afb63fbca774449a8fbc"} Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.577532 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5967795cb6-dtwwt"] Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.588746 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5967795cb6-dtwwt"] Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.605884 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-aa02-account-create-update-tf6q4" podStartSLOduration=4.605851624 podStartE2EDuration="4.605851624s" podCreationTimestamp="2025-12-01 20:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:56.574424268 +0000 UTC m=+1358.136983909" watchObservedRunningTime="2025-12-01 20:18:56.605851624 +0000 UTC m=+1358.168411265" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.607777 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-3c20-account-create-update-frnjt" podStartSLOduration=4.607770875 podStartE2EDuration="4.607770875s" podCreationTimestamp="2025-12-01 
20:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:18:56.590287246 +0000 UTC m=+1358.152846887" watchObservedRunningTime="2025-12-01 20:18:56.607770875 +0000 UTC m=+1358.170330516" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.742099 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b30d2288-725d-4cdf-9297-ad9afee36691" path="/var/lib/kubelet/pods/b30d2288-725d-4cdf-9297-ad9afee36691/volumes" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.743012 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67137e5-b9ea-4ca0-8851-a7f041ad745e" path="/var/lib/kubelet/pods/d67137e5-b9ea-4ca0-8851-a7f041ad745e/volumes" Dec 01 20:18:56 crc kubenswrapper[4802]: I1201 20:18:56.908514 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.595855 4802 generic.go:334] "Generic (PLEG): container finished" podID="fc3b62ea-1f25-4221-b09f-5a915164fa80" containerID="9e7d7b9c7371b813189ed90f7f50dd2604410c6435ea5d2c30fc4a0a5501a80d" exitCode=0 Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.595950 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" event={"ID":"fc3b62ea-1f25-4221-b09f-5a915164fa80","Type":"ContainerDied","Data":"9e7d7b9c7371b813189ed90f7f50dd2604410c6435ea5d2c30fc4a0a5501a80d"} Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.599441 4802 generic.go:334] "Generic (PLEG): container finished" podID="a13d5208-da5b-4c02-84e5-871547bbafbf" containerID="989dd14f44779cadab7b021d956f743850af71d135a9136601e559fbba569b63" exitCode=0 Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.599529 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xf6km" 
event={"ID":"a13d5208-da5b-4c02-84e5-871547bbafbf","Type":"ContainerDied","Data":"989dd14f44779cadab7b021d956f743850af71d135a9136601e559fbba569b63"} Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.601890 4802 generic.go:334] "Generic (PLEG): container finished" podID="c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e" containerID="40026d4792378d5240c3f320f5f8d901bda39fbae51edd49b253b8675229f89c" exitCode=0 Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.602012 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qplvx" event={"ID":"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e","Type":"ContainerDied","Data":"40026d4792378d5240c3f320f5f8d901bda39fbae51edd49b253b8675229f89c"} Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.610182 4802 generic.go:334] "Generic (PLEG): container finished" podID="b26bcb68-81a2-43fa-b81f-29dbc8ca213e" containerID="9a72b4640dc57e0f397acdcae94effa644ef7c6f13439da636080dfd29401d8b" exitCode=0 Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.610288 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2kqwl" event={"ID":"b26bcb68-81a2-43fa-b81f-29dbc8ca213e","Type":"ContainerDied","Data":"9a72b4640dc57e0f397acdcae94effa644ef7c6f13439da636080dfd29401d8b"} Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.613994 4802 generic.go:334] "Generic (PLEG): container finished" podID="efc9020e-4f69-49c4-9788-aa84f98d8c77" containerID="4f2dbefc833f7007d51200a196e4c47c2805ddacb4d255b9f0013ee4abd07172" exitCode=0 Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.614039 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-aa02-account-create-update-tf6q4" event={"ID":"efc9020e-4f69-49c4-9788-aa84f98d8c77","Type":"ContainerDied","Data":"4f2dbefc833f7007d51200a196e4c47c2805ddacb4d255b9f0013ee4abd07172"} Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.623444 4802 generic.go:334] "Generic (PLEG): container finished" 
podID="53cf7526-8d6a-4789-8582-854087ec7b2b" containerID="6bea7ec021452b58fc7fec289da0c417cb45f371401c8b3cf7ed26ab7e516354" exitCode=0 Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.623496 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3c20-account-create-update-frnjt" event={"ID":"53cf7526-8d6a-4789-8582-854087ec7b2b","Type":"ContainerDied","Data":"6bea7ec021452b58fc7fec289da0c417cb45f371401c8b3cf7ed26ab7e516354"} Dec 01 20:18:57 crc kubenswrapper[4802]: I1201 20:18:57.635425 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerStarted","Data":"34651d7cc5d39c692f76d6947f5053908df0eedaab8e4d221f332858629e511f"} Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.062522 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.088551 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.088641 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.088705 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.089738 4802 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"64791094ee9b4b30a04bff1aa7e941faf215ab40eb68a1d66d92dede04b50331"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.089862 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://64791094ee9b4b30a04bff1aa7e941faf215ab40eb68a1d66d92dede04b50331" gracePeriod=600 Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.192853 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpsh7\" (UniqueName: \"kubernetes.io/projected/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-kube-api-access-gpsh7\") pod \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\" (UID: \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\") " Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.193010 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-operator-scripts\") pod \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\" (UID: \"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e\") " Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.193774 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e" (UID: "c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.201217 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-kube-api-access-gpsh7" (OuterVolumeSpecName: "kube-api-access-gpsh7") pod "c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e" (UID: "c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e"). InnerVolumeSpecName "kube-api-access-gpsh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.242276 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.295827 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpsh7\" (UniqueName: \"kubernetes.io/projected/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-kube-api-access-gpsh7\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.295862 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.666478 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerStarted","Data":"e21c884a589594982f3510cb07d0197280502395df387f0452d38427ea337032"} Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.666936 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerStarted","Data":"991e84fd2c1baf37384d189746a8621ea9fe95835eb39ba3ae9df461c1ce1c7f"} Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.673160 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" 
containerID="64791094ee9b4b30a04bff1aa7e941faf215ab40eb68a1d66d92dede04b50331" exitCode=0 Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.673226 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"64791094ee9b4b30a04bff1aa7e941faf215ab40eb68a1d66d92dede04b50331"} Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.673294 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619"} Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.673319 4802 scope.go:117] "RemoveContainer" containerID="cdab3d21e9b678aa33ac2623e4e7233c7b5184d7898d89b02a2bb479a6f32dd9" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.678486 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qplvx" Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.678533 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qplvx" event={"ID":"c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e","Type":"ContainerDied","Data":"d07853264e1d7e312941c7f193b1155c508ad2b19bdd593688d5ffdc8aa3d94e"} Dec 01 20:18:58 crc kubenswrapper[4802]: I1201 20:18:58.678571 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d07853264e1d7e312941c7f193b1155c508ad2b19bdd593688d5ffdc8aa3d94e" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.075859 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.233757 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxh26\" (UniqueName: \"kubernetes.io/projected/53cf7526-8d6a-4789-8582-854087ec7b2b-kube-api-access-gxh26\") pod \"53cf7526-8d6a-4789-8582-854087ec7b2b\" (UID: \"53cf7526-8d6a-4789-8582-854087ec7b2b\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.234224 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53cf7526-8d6a-4789-8582-854087ec7b2b-operator-scripts\") pod \"53cf7526-8d6a-4789-8582-854087ec7b2b\" (UID: \"53cf7526-8d6a-4789-8582-854087ec7b2b\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.235683 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cf7526-8d6a-4789-8582-854087ec7b2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53cf7526-8d6a-4789-8582-854087ec7b2b" (UID: "53cf7526-8d6a-4789-8582-854087ec7b2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.241448 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cf7526-8d6a-4789-8582-854087ec7b2b-kube-api-access-gxh26" (OuterVolumeSpecName: "kube-api-access-gxh26") pod "53cf7526-8d6a-4789-8582-854087ec7b2b" (UID: "53cf7526-8d6a-4789-8582-854087ec7b2b"). InnerVolumeSpecName "kube-api-access-gxh26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.336424 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxh26\" (UniqueName: \"kubernetes.io/projected/53cf7526-8d6a-4789-8582-854087ec7b2b-kube-api-access-gxh26\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.336449 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53cf7526-8d6a-4789-8582-854087ec7b2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.419802 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.421322 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.426398 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.440694 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-xf6km" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.538389 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48rj2\" (UniqueName: \"kubernetes.io/projected/a13d5208-da5b-4c02-84e5-871547bbafbf-kube-api-access-48rj2\") pod \"a13d5208-da5b-4c02-84e5-871547bbafbf\" (UID: \"a13d5208-da5b-4c02-84e5-871547bbafbf\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.538708 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzgjp\" (UniqueName: \"kubernetes.io/projected/fc3b62ea-1f25-4221-b09f-5a915164fa80-kube-api-access-dzgjp\") pod \"fc3b62ea-1f25-4221-b09f-5a915164fa80\" (UID: \"fc3b62ea-1f25-4221-b09f-5a915164fa80\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.538767 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a13d5208-da5b-4c02-84e5-871547bbafbf-operator-scripts\") pod \"a13d5208-da5b-4c02-84e5-871547bbafbf\" (UID: \"a13d5208-da5b-4c02-84e5-871547bbafbf\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.538855 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc3b62ea-1f25-4221-b09f-5a915164fa80-operator-scripts\") pod \"fc3b62ea-1f25-4221-b09f-5a915164fa80\" (UID: \"fc3b62ea-1f25-4221-b09f-5a915164fa80\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.538972 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc9020e-4f69-49c4-9788-aa84f98d8c77-operator-scripts\") pod \"efc9020e-4f69-49c4-9788-aa84f98d8c77\" (UID: \"efc9020e-4f69-49c4-9788-aa84f98d8c77\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.539047 4802 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-86p6v\" (UniqueName: \"kubernetes.io/projected/efc9020e-4f69-49c4-9788-aa84f98d8c77-kube-api-access-86p6v\") pod \"efc9020e-4f69-49c4-9788-aa84f98d8c77\" (UID: \"efc9020e-4f69-49c4-9788-aa84f98d8c77\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.539089 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-operator-scripts\") pod \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\" (UID: \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.539113 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbj6f\" (UniqueName: \"kubernetes.io/projected/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-kube-api-access-xbj6f\") pod \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\" (UID: \"b26bcb68-81a2-43fa-b81f-29dbc8ca213e\") " Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.541401 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc3b62ea-1f25-4221-b09f-5a915164fa80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc3b62ea-1f25-4221-b09f-5a915164fa80" (UID: "fc3b62ea-1f25-4221-b09f-5a915164fa80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.542507 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b26bcb68-81a2-43fa-b81f-29dbc8ca213e" (UID: "b26bcb68-81a2-43fa-b81f-29dbc8ca213e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.543101 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efc9020e-4f69-49c4-9788-aa84f98d8c77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efc9020e-4f69-49c4-9788-aa84f98d8c77" (UID: "efc9020e-4f69-49c4-9788-aa84f98d8c77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.543633 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-kube-api-access-xbj6f" (OuterVolumeSpecName: "kube-api-access-xbj6f") pod "b26bcb68-81a2-43fa-b81f-29dbc8ca213e" (UID: "b26bcb68-81a2-43fa-b81f-29dbc8ca213e"). InnerVolumeSpecName "kube-api-access-xbj6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.543776 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13d5208-da5b-4c02-84e5-871547bbafbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a13d5208-da5b-4c02-84e5-871547bbafbf" (UID: "a13d5208-da5b-4c02-84e5-871547bbafbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.545620 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13d5208-da5b-4c02-84e5-871547bbafbf-kube-api-access-48rj2" (OuterVolumeSpecName: "kube-api-access-48rj2") pod "a13d5208-da5b-4c02-84e5-871547bbafbf" (UID: "a13d5208-da5b-4c02-84e5-871547bbafbf"). InnerVolumeSpecName "kube-api-access-48rj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.546285 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3b62ea-1f25-4221-b09f-5a915164fa80-kube-api-access-dzgjp" (OuterVolumeSpecName: "kube-api-access-dzgjp") pod "fc3b62ea-1f25-4221-b09f-5a915164fa80" (UID: "fc3b62ea-1f25-4221-b09f-5a915164fa80"). InnerVolumeSpecName "kube-api-access-dzgjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.549383 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc9020e-4f69-49c4-9788-aa84f98d8c77-kube-api-access-86p6v" (OuterVolumeSpecName: "kube-api-access-86p6v") pod "efc9020e-4f69-49c4-9788-aa84f98d8c77" (UID: "efc9020e-4f69-49c4-9788-aa84f98d8c77"). InnerVolumeSpecName "kube-api-access-86p6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.641835 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86p6v\" (UniqueName: \"kubernetes.io/projected/efc9020e-4f69-49c4-9788-aa84f98d8c77-kube-api-access-86p6v\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.641893 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.641911 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbj6f\" (UniqueName: \"kubernetes.io/projected/b26bcb68-81a2-43fa-b81f-29dbc8ca213e-kube-api-access-xbj6f\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.641924 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48rj2\" (UniqueName: 
\"kubernetes.io/projected/a13d5208-da5b-4c02-84e5-871547bbafbf-kube-api-access-48rj2\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.641937 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzgjp\" (UniqueName: \"kubernetes.io/projected/fc3b62ea-1f25-4221-b09f-5a915164fa80-kube-api-access-dzgjp\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.641950 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a13d5208-da5b-4c02-84e5-871547bbafbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.641960 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc3b62ea-1f25-4221-b09f-5a915164fa80-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.641978 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efc9020e-4f69-49c4-9788-aa84f98d8c77-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.693278 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.693279 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ee5c-account-create-update-xw4zq" event={"ID":"fc3b62ea-1f25-4221-b09f-5a915164fa80","Type":"ContainerDied","Data":"afa80b95b740fba45b5f8daf1fba93796eea497b9fdc6ba955db8bb24c733db7"} Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.694939 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afa80b95b740fba45b5f8daf1fba93796eea497b9fdc6ba955db8bb24c733db7" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.696309 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xf6km" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.696348 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xf6km" event={"ID":"a13d5208-da5b-4c02-84e5-871547bbafbf","Type":"ContainerDied","Data":"28ab1c52fdbfba31d8a056ffbd9569ab651346f7712e885fa35048b7a10e46c5"} Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.696407 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28ab1c52fdbfba31d8a056ffbd9569ab651346f7712e885fa35048b7a10e46c5" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.698400 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-2kqwl" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.698399 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2kqwl" event={"ID":"b26bcb68-81a2-43fa-b81f-29dbc8ca213e","Type":"ContainerDied","Data":"1de0977a3db16cb9b0bcac2956cf56dd7bdef92f58d6eb51c17eeb48ddb17a9a"} Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.698524 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1de0977a3db16cb9b0bcac2956cf56dd7bdef92f58d6eb51c17eeb48ddb17a9a" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.699952 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-aa02-account-create-update-tf6q4" event={"ID":"efc9020e-4f69-49c4-9788-aa84f98d8c77","Type":"ContainerDied","Data":"b61abdc29f47f7346fcfb6b19cf4c2e7513049d57d65eaf200103406ad389868"} Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.699980 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b61abdc29f47f7346fcfb6b19cf4c2e7513049d57d65eaf200103406ad389868" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.699956 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-aa02-account-create-update-tf6q4" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.702082 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3c20-account-create-update-frnjt" event={"ID":"53cf7526-8d6a-4789-8582-854087ec7b2b","Type":"ContainerDied","Data":"d7dd12452a1e52b40a040b4c9b003a3cbdcbbe851f95afb63fbca774449a8fbc"} Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.702118 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7dd12452a1e52b40a040b4c9b003a3cbdcbbe851f95afb63fbca774449a8fbc" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.702303 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3c20-account-create-update-frnjt" Dec 01 20:18:59 crc kubenswrapper[4802]: I1201 20:18:59.705303 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerStarted","Data":"6f3f98445da87c705cadba915716c356eda7725654d00a122e8493a599d8da1a"} Dec 01 20:19:01 crc kubenswrapper[4802]: I1201 20:19:01.730238 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerStarted","Data":"5f7824e08943e4ab793719b6939e37c940fb92f52ac57b7a5a8fb9ac5d2b66c0"} Dec 01 20:19:01 crc kubenswrapper[4802]: I1201 20:19:01.730396 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="ceilometer-central-agent" containerID="cri-o://991e84fd2c1baf37384d189746a8621ea9fe95835eb39ba3ae9df461c1ce1c7f" gracePeriod=30 Dec 01 20:19:01 crc kubenswrapper[4802]: I1201 20:19:01.730467 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" 
containerName="ceilometer-notification-agent" containerID="cri-o://e21c884a589594982f3510cb07d0197280502395df387f0452d38427ea337032" gracePeriod=30 Dec 01 20:19:01 crc kubenswrapper[4802]: I1201 20:19:01.730482 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="proxy-httpd" containerID="cri-o://5f7824e08943e4ab793719b6939e37c940fb92f52ac57b7a5a8fb9ac5d2b66c0" gracePeriod=30 Dec 01 20:19:01 crc kubenswrapper[4802]: I1201 20:19:01.730549 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="sg-core" containerID="cri-o://6f3f98445da87c705cadba915716c356eda7725654d00a122e8493a599d8da1a" gracePeriod=30 Dec 01 20:19:01 crc kubenswrapper[4802]: I1201 20:19:01.731304 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 20:19:01 crc kubenswrapper[4802]: I1201 20:19:01.772939 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.903977797 podStartE2EDuration="6.772911716s" podCreationTimestamp="2025-12-01 20:18:55 +0000 UTC" firstStartedPulling="2025-12-01 20:18:56.929439643 +0000 UTC m=+1358.491999284" lastFinishedPulling="2025-12-01 20:19:00.798373562 +0000 UTC m=+1362.360933203" observedRunningTime="2025-12-01 20:19:01.762090666 +0000 UTC m=+1363.324650327" watchObservedRunningTime="2025-12-01 20:19:01.772911716 +0000 UTC m=+1363.335471357" Dec 01 20:19:02 crc kubenswrapper[4802]: I1201 20:19:02.745509 4802 generic.go:334] "Generic (PLEG): container finished" podID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerID="5f7824e08943e4ab793719b6939e37c940fb92f52ac57b7a5a8fb9ac5d2b66c0" exitCode=0 Dec 01 20:19:02 crc kubenswrapper[4802]: I1201 20:19:02.746002 4802 generic.go:334] "Generic (PLEG): container finished" 
podID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerID="6f3f98445da87c705cadba915716c356eda7725654d00a122e8493a599d8da1a" exitCode=2 Dec 01 20:19:02 crc kubenswrapper[4802]: I1201 20:19:02.746020 4802 generic.go:334] "Generic (PLEG): container finished" podID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerID="e21c884a589594982f3510cb07d0197280502395df387f0452d38427ea337032" exitCode=0 Dec 01 20:19:02 crc kubenswrapper[4802]: I1201 20:19:02.746052 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerDied","Data":"5f7824e08943e4ab793719b6939e37c940fb92f52ac57b7a5a8fb9ac5d2b66c0"} Dec 01 20:19:02 crc kubenswrapper[4802]: I1201 20:19:02.746092 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerDied","Data":"6f3f98445da87c705cadba915716c356eda7725654d00a122e8493a599d8da1a"} Dec 01 20:19:02 crc kubenswrapper[4802]: I1201 20:19:02.746109 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerDied","Data":"e21c884a589594982f3510cb07d0197280502395df387f0452d38427ea337032"} Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.234231 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r7wmv"] Dec 01 20:19:03 crc kubenswrapper[4802]: E1201 20:19:03.235095 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3b62ea-1f25-4221-b09f-5a915164fa80" containerName="mariadb-account-create-update" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235125 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3b62ea-1f25-4221-b09f-5a915164fa80" containerName="mariadb-account-create-update" Dec 01 20:19:03 crc kubenswrapper[4802]: E1201 20:19:03.235148 4802 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="efc9020e-4f69-49c4-9788-aa84f98d8c77" containerName="mariadb-account-create-update" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235156 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc9020e-4f69-49c4-9788-aa84f98d8c77" containerName="mariadb-account-create-update" Dec 01 20:19:03 crc kubenswrapper[4802]: E1201 20:19:03.235173 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26bcb68-81a2-43fa-b81f-29dbc8ca213e" containerName="mariadb-database-create" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235182 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26bcb68-81a2-43fa-b81f-29dbc8ca213e" containerName="mariadb-database-create" Dec 01 20:19:03 crc kubenswrapper[4802]: E1201 20:19:03.235220 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cf7526-8d6a-4789-8582-854087ec7b2b" containerName="mariadb-account-create-update" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235230 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cf7526-8d6a-4789-8582-854087ec7b2b" containerName="mariadb-account-create-update" Dec 01 20:19:03 crc kubenswrapper[4802]: E1201 20:19:03.235247 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13d5208-da5b-4c02-84e5-871547bbafbf" containerName="mariadb-database-create" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235254 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13d5208-da5b-4c02-84e5-871547bbafbf" containerName="mariadb-database-create" Dec 01 20:19:03 crc kubenswrapper[4802]: E1201 20:19:03.235268 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e" containerName="mariadb-database-create" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235276 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e" containerName="mariadb-database-create" Dec 01 20:19:03 crc 
kubenswrapper[4802]: I1201 20:19:03.235489 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cf7526-8d6a-4789-8582-854087ec7b2b" containerName="mariadb-account-create-update" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235509 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13d5208-da5b-4c02-84e5-871547bbafbf" containerName="mariadb-database-create" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235526 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3b62ea-1f25-4221-b09f-5a915164fa80" containerName="mariadb-account-create-update" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235539 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc9020e-4f69-49c4-9788-aa84f98d8c77" containerName="mariadb-account-create-update" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235550 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26bcb68-81a2-43fa-b81f-29dbc8ca213e" containerName="mariadb-database-create" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.235559 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e" containerName="mariadb-database-create" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.236406 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.239062 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-k47dl" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.240642 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.242838 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r7wmv"] Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.244537 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.331249 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.331329 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-scripts\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.331411 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-config-data\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " 
pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.331451 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwwl6\" (UniqueName: \"kubernetes.io/projected/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-kube-api-access-bwwl6\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.433403 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.433484 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-scripts\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.433546 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-config-data\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.433584 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwwl6\" (UniqueName: \"kubernetes.io/projected/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-kube-api-access-bwwl6\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: 
\"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.441497 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-config-data\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.444835 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-scripts\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.445251 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.454462 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwwl6\" (UniqueName: \"kubernetes.io/projected/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-kube-api-access-bwwl6\") pod \"nova-cell0-conductor-db-sync-r7wmv\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:03 crc kubenswrapper[4802]: I1201 20:19:03.564970 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:04 crc kubenswrapper[4802]: I1201 20:19:04.070844 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r7wmv"] Dec 01 20:19:04 crc kubenswrapper[4802]: I1201 20:19:04.771071 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r7wmv" event={"ID":"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456","Type":"ContainerStarted","Data":"c4b6cf022d6d89a5ea15774dba29cecf367c83fcb9a3871f2eaae11c8f4d3c3e"} Dec 01 20:19:06 crc kubenswrapper[4802]: I1201 20:19:06.800589 4802 generic.go:334] "Generic (PLEG): container finished" podID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerID="991e84fd2c1baf37384d189746a8621ea9fe95835eb39ba3ae9df461c1ce1c7f" exitCode=0 Dec 01 20:19:06 crc kubenswrapper[4802]: I1201 20:19:06.800665 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerDied","Data":"991e84fd2c1baf37384d189746a8621ea9fe95835eb39ba3ae9df461c1ce1c7f"} Dec 01 20:19:06 crc kubenswrapper[4802]: I1201 20:19:06.913318 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.008155 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-run-httpd\") pod \"8049ca74-7d3d-4532-8572-d5581f5d62f2\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.008298 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-sg-core-conf-yaml\") pod \"8049ca74-7d3d-4532-8572-d5581f5d62f2\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.008390 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-combined-ca-bundle\") pod \"8049ca74-7d3d-4532-8572-d5581f5d62f2\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.008414 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcs4s\" (UniqueName: \"kubernetes.io/projected/8049ca74-7d3d-4532-8572-d5581f5d62f2-kube-api-access-qcs4s\") pod \"8049ca74-7d3d-4532-8572-d5581f5d62f2\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.008462 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-scripts\") pod \"8049ca74-7d3d-4532-8572-d5581f5d62f2\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.008499 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-config-data\") pod \"8049ca74-7d3d-4532-8572-d5581f5d62f2\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.008613 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-log-httpd\") pod \"8049ca74-7d3d-4532-8572-d5581f5d62f2\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.008811 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8049ca74-7d3d-4532-8572-d5581f5d62f2" (UID: "8049ca74-7d3d-4532-8572-d5581f5d62f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.009335 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8049ca74-7d3d-4532-8572-d5581f5d62f2" (UID: "8049ca74-7d3d-4532-8572-d5581f5d62f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.009435 4802 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.016599 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-scripts" (OuterVolumeSpecName: "scripts") pod "8049ca74-7d3d-4532-8572-d5581f5d62f2" (UID: "8049ca74-7d3d-4532-8572-d5581f5d62f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.016880 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8049ca74-7d3d-4532-8572-d5581f5d62f2-kube-api-access-qcs4s" (OuterVolumeSpecName: "kube-api-access-qcs4s") pod "8049ca74-7d3d-4532-8572-d5581f5d62f2" (UID: "8049ca74-7d3d-4532-8572-d5581f5d62f2"). InnerVolumeSpecName "kube-api-access-qcs4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.043730 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8049ca74-7d3d-4532-8572-d5581f5d62f2" (UID: "8049ca74-7d3d-4532-8572-d5581f5d62f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.099039 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8049ca74-7d3d-4532-8572-d5581f5d62f2" (UID: "8049ca74-7d3d-4532-8572-d5581f5d62f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.111061 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-config-data" (OuterVolumeSpecName: "config-data") pod "8049ca74-7d3d-4532-8572-d5581f5d62f2" (UID: "8049ca74-7d3d-4532-8572-d5581f5d62f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.111243 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-config-data\") pod \"8049ca74-7d3d-4532-8572-d5581f5d62f2\" (UID: \"8049ca74-7d3d-4532-8572-d5581f5d62f2\") " Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.111793 4802 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8049ca74-7d3d-4532-8572-d5581f5d62f2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.111825 4802 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.111840 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.111854 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcs4s\" (UniqueName: \"kubernetes.io/projected/8049ca74-7d3d-4532-8572-d5581f5d62f2-kube-api-access-qcs4s\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.111865 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:07 crc kubenswrapper[4802]: W1201 20:19:07.111976 4802 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8049ca74-7d3d-4532-8572-d5581f5d62f2/volumes/kubernetes.io~secret/config-data Dec 01 20:19:07 crc 
kubenswrapper[4802]: I1201 20:19:07.111992 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-config-data" (OuterVolumeSpecName: "config-data") pod "8049ca74-7d3d-4532-8572-d5581f5d62f2" (UID: "8049ca74-7d3d-4532-8572-d5581f5d62f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.213432 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8049ca74-7d3d-4532-8572-d5581f5d62f2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.848993 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8049ca74-7d3d-4532-8572-d5581f5d62f2","Type":"ContainerDied","Data":"34651d7cc5d39c692f76d6947f5053908df0eedaab8e4d221f332858629e511f"} Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.849338 4802 scope.go:117] "RemoveContainer" containerID="5f7824e08943e4ab793719b6939e37c940fb92f52ac57b7a5a8fb9ac5d2b66c0" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.849507 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.902543 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.920726 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.936090 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:07 crc kubenswrapper[4802]: E1201 20:19:07.936498 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="sg-core" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.936543 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="sg-core" Dec 01 20:19:07 crc kubenswrapper[4802]: E1201 20:19:07.936558 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="ceilometer-notification-agent" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.936564 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="ceilometer-notification-agent" Dec 01 20:19:07 crc kubenswrapper[4802]: E1201 20:19:07.936573 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="proxy-httpd" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.936579 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="proxy-httpd" Dec 01 20:19:07 crc kubenswrapper[4802]: E1201 20:19:07.936606 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="ceilometer-central-agent" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.936613 4802 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="ceilometer-central-agent" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.936786 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="proxy-httpd" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.936799 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="ceilometer-central-agent" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.936809 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="sg-core" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.936816 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" containerName="ceilometer-notification-agent" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.938305 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.941407 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.942512 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 20:19:07 crc kubenswrapper[4802]: I1201 20:19:07.952355 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.029658 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.029744 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n8p2\" (UniqueName: \"kubernetes.io/projected/8200b146-6478-497b-ac37-d42d4e947a2d-kube-api-access-6n8p2\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.029820 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-scripts\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.029859 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-run-httpd\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " 
pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.029909 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.029948 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-config-data\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.029980 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-log-httpd\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.131652 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.132010 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-config-data\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.132049 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-log-httpd\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.132191 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.132279 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n8p2\" (UniqueName: \"kubernetes.io/projected/8200b146-6478-497b-ac37-d42d4e947a2d-kube-api-access-6n8p2\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.132362 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-scripts\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.132407 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-run-httpd\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.132847 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-run-httpd\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc 
kubenswrapper[4802]: I1201 20:19:08.133391 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-log-httpd\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.140869 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.141050 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-scripts\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.141443 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-config-data\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.148302 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.163124 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n8p2\" (UniqueName: \"kubernetes.io/projected/8200b146-6478-497b-ac37-d42d4e947a2d-kube-api-access-6n8p2\") pod \"ceilometer-0\" (UID: 
\"8200b146-6478-497b-ac37-d42d4e947a2d\") " pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.263005 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:08 crc kubenswrapper[4802]: I1201 20:19:08.733956 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8049ca74-7d3d-4532-8572-d5581f5d62f2" path="/var/lib/kubelet/pods/8049ca74-7d3d-4532-8572-d5581f5d62f2/volumes" Dec 01 20:19:11 crc kubenswrapper[4802]: I1201 20:19:11.101812 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:12 crc kubenswrapper[4802]: I1201 20:19:12.349905 4802 scope.go:117] "RemoveContainer" containerID="6f3f98445da87c705cadba915716c356eda7725654d00a122e8493a599d8da1a" Dec 01 20:19:12 crc kubenswrapper[4802]: I1201 20:19:12.410705 4802 scope.go:117] "RemoveContainer" containerID="e21c884a589594982f3510cb07d0197280502395df387f0452d38427ea337032" Dec 01 20:19:12 crc kubenswrapper[4802]: I1201 20:19:12.565084 4802 scope.go:117] "RemoveContainer" containerID="991e84fd2c1baf37384d189746a8621ea9fe95835eb39ba3ae9df461c1ce1c7f" Dec 01 20:19:12 crc kubenswrapper[4802]: I1201 20:19:12.827428 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:12 crc kubenswrapper[4802]: W1201 20:19:12.857041 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8200b146_6478_497b_ac37_d42d4e947a2d.slice/crio-91f113387ba4c5b5e22e3fa7a053f0af4199cc2e5155ed478eb57f330dd0b4ab WatchSource:0}: Error finding container 91f113387ba4c5b5e22e3fa7a053f0af4199cc2e5155ed478eb57f330dd0b4ab: Status 404 returned error can't find the container with id 91f113387ba4c5b5e22e3fa7a053f0af4199cc2e5155ed478eb57f330dd0b4ab Dec 01 20:19:12 crc kubenswrapper[4802]: I1201 20:19:12.908753 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-r7wmv" event={"ID":"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456","Type":"ContainerStarted","Data":"63a09e19a16e3d246e87761831ab8a56de058b847c8ffab420ed16ab34b2dc0a"} Dec 01 20:19:12 crc kubenswrapper[4802]: I1201 20:19:12.909821 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerStarted","Data":"91f113387ba4c5b5e22e3fa7a053f0af4199cc2e5155ed478eb57f330dd0b4ab"} Dec 01 20:19:12 crc kubenswrapper[4802]: I1201 20:19:12.927973 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-r7wmv" podStartSLOduration=1.5487092580000001 podStartE2EDuration="9.927954531s" podCreationTimestamp="2025-12-01 20:19:03 +0000 UTC" firstStartedPulling="2025-12-01 20:19:04.070550657 +0000 UTC m=+1365.633110298" lastFinishedPulling="2025-12-01 20:19:12.44979593 +0000 UTC m=+1374.012355571" observedRunningTime="2025-12-01 20:19:12.926384732 +0000 UTC m=+1374.488944393" watchObservedRunningTime="2025-12-01 20:19:12.927954531 +0000 UTC m=+1374.490514172" Dec 01 20:19:13 crc kubenswrapper[4802]: I1201 20:19:13.919923 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerStarted","Data":"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035"} Dec 01 20:19:14 crc kubenswrapper[4802]: I1201 20:19:14.931333 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerStarted","Data":"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e"} Dec 01 20:19:16 crc kubenswrapper[4802]: I1201 20:19:16.955874 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerStarted","Data":"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67"} Dec 01 20:19:18 crc kubenswrapper[4802]: I1201 20:19:18.977160 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerStarted","Data":"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201"} Dec 01 20:19:18 crc kubenswrapper[4802]: I1201 20:19:18.977762 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 20:19:18 crc kubenswrapper[4802]: I1201 20:19:18.977763 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="proxy-httpd" containerID="cri-o://ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201" gracePeriod=30 Dec 01 20:19:18 crc kubenswrapper[4802]: I1201 20:19:18.977779 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="sg-core" containerID="cri-o://c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67" gracePeriod=30 Dec 01 20:19:18 crc kubenswrapper[4802]: I1201 20:19:18.977299 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="ceilometer-central-agent" containerID="cri-o://14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035" gracePeriod=30 Dec 01 20:19:18 crc kubenswrapper[4802]: I1201 20:19:18.977844 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="ceilometer-notification-agent" containerID="cri-o://840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e" 
gracePeriod=30 Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.006107 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.729638329 podStartE2EDuration="12.006085515s" podCreationTimestamp="2025-12-01 20:19:07 +0000 UTC" firstStartedPulling="2025-12-01 20:19:12.859712408 +0000 UTC m=+1374.422272040" lastFinishedPulling="2025-12-01 20:19:18.136159585 +0000 UTC m=+1379.698719226" observedRunningTime="2025-12-01 20:19:19.004218187 +0000 UTC m=+1380.566777828" watchObservedRunningTime="2025-12-01 20:19:19.006085515 +0000 UTC m=+1380.568645166" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.742034 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.842388 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-scripts\") pod \"8200b146-6478-497b-ac37-d42d4e947a2d\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.842463 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-sg-core-conf-yaml\") pod \"8200b146-6478-497b-ac37-d42d4e947a2d\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.842593 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n8p2\" (UniqueName: \"kubernetes.io/projected/8200b146-6478-497b-ac37-d42d4e947a2d-kube-api-access-6n8p2\") pod \"8200b146-6478-497b-ac37-d42d4e947a2d\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.842646 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-log-httpd\") pod \"8200b146-6478-497b-ac37-d42d4e947a2d\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.842739 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-combined-ca-bundle\") pod \"8200b146-6478-497b-ac37-d42d4e947a2d\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.842818 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-config-data\") pod \"8200b146-6478-497b-ac37-d42d4e947a2d\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.842842 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-run-httpd\") pod \"8200b146-6478-497b-ac37-d42d4e947a2d\" (UID: \"8200b146-6478-497b-ac37-d42d4e947a2d\") " Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.843389 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8200b146-6478-497b-ac37-d42d4e947a2d" (UID: "8200b146-6478-497b-ac37-d42d4e947a2d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.843557 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8200b146-6478-497b-ac37-d42d4e947a2d" (UID: "8200b146-6478-497b-ac37-d42d4e947a2d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.848022 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8200b146-6478-497b-ac37-d42d4e947a2d-kube-api-access-6n8p2" (OuterVolumeSpecName: "kube-api-access-6n8p2") pod "8200b146-6478-497b-ac37-d42d4e947a2d" (UID: "8200b146-6478-497b-ac37-d42d4e947a2d"). InnerVolumeSpecName "kube-api-access-6n8p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.850174 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-scripts" (OuterVolumeSpecName: "scripts") pod "8200b146-6478-497b-ac37-d42d4e947a2d" (UID: "8200b146-6478-497b-ac37-d42d4e947a2d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.866757 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8200b146-6478-497b-ac37-d42d4e947a2d" (UID: "8200b146-6478-497b-ac37-d42d4e947a2d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.919706 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8200b146-6478-497b-ac37-d42d4e947a2d" (UID: "8200b146-6478-497b-ac37-d42d4e947a2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.926555 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-config-data" (OuterVolumeSpecName: "config-data") pod "8200b146-6478-497b-ac37-d42d4e947a2d" (UID: "8200b146-6478-497b-ac37-d42d4e947a2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.944830 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n8p2\" (UniqueName: \"kubernetes.io/projected/8200b146-6478-497b-ac37-d42d4e947a2d-kube-api-access-6n8p2\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.944856 4802 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.944871 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.944883 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-config-data\") on node \"crc\" DevicePath \"\"" 
Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.944895 4802 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8200b146-6478-497b-ac37-d42d4e947a2d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.944906 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.944919 4802 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8200b146-6478-497b-ac37-d42d4e947a2d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995375 4802 generic.go:334] "Generic (PLEG): container finished" podID="8200b146-6478-497b-ac37-d42d4e947a2d" containerID="ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201" exitCode=0 Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995407 4802 generic.go:334] "Generic (PLEG): container finished" podID="8200b146-6478-497b-ac37-d42d4e947a2d" containerID="c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67" exitCode=2 Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995417 4802 generic.go:334] "Generic (PLEG): container finished" podID="8200b146-6478-497b-ac37-d42d4e947a2d" containerID="840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e" exitCode=0 Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995425 4802 generic.go:334] "Generic (PLEG): container finished" podID="8200b146-6478-497b-ac37-d42d4e947a2d" containerID="14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035" exitCode=0 Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995446 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerDied","Data":"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201"} Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995476 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerDied","Data":"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67"} Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995491 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerDied","Data":"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e"} Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995502 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerDied","Data":"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035"} Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995513 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8200b146-6478-497b-ac37-d42d4e947a2d","Type":"ContainerDied","Data":"91f113387ba4c5b5e22e3fa7a053f0af4199cc2e5155ed478eb57f330dd0b4ab"} Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995532 4802 scope.go:117] "RemoveContainer" containerID="ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201" Dec 01 20:19:19 crc kubenswrapper[4802]: I1201 20:19:19.995695 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.025758 4802 scope.go:117] "RemoveContainer" containerID="c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.044734 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.053990 4802 scope.go:117] "RemoveContainer" containerID="840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.065056 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.079478 4802 scope.go:117] "RemoveContainer" containerID="14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.085112 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:20 crc kubenswrapper[4802]: E1201 20:19:20.085593 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="proxy-httpd" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.085702 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="proxy-httpd" Dec 01 20:19:20 crc kubenswrapper[4802]: E1201 20:19:20.085748 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="sg-core" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.085758 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="sg-core" Dec 01 20:19:20 crc kubenswrapper[4802]: E1201 20:19:20.085774 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" 
containerName="ceilometer-central-agent" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.085782 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="ceilometer-central-agent" Dec 01 20:19:20 crc kubenswrapper[4802]: E1201 20:19:20.085804 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="ceilometer-notification-agent" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.085813 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="ceilometer-notification-agent" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.086045 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="proxy-httpd" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.086079 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="ceilometer-notification-agent" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.086108 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="sg-core" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.086133 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" containerName="ceilometer-central-agent" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.088098 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.090311 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.090638 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.107496 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.125682 4802 scope.go:117] "RemoveContainer" containerID="ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201" Dec 01 20:19:20 crc kubenswrapper[4802]: E1201 20:19:20.126168 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201\": container with ID starting with ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201 not found: ID does not exist" containerID="ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.126219 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201"} err="failed to get container status \"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201\": rpc error: code = NotFound desc = could not find container \"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201\": container with ID starting with ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.126246 4802 scope.go:117] "RemoveContainer" containerID="c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67" Dec 01 20:19:20 crc kubenswrapper[4802]: E1201 
20:19:20.126848 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67\": container with ID starting with c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67 not found: ID does not exist" containerID="c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.126906 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67"} err="failed to get container status \"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67\": rpc error: code = NotFound desc = could not find container \"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67\": container with ID starting with c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.126939 4802 scope.go:117] "RemoveContainer" containerID="840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e" Dec 01 20:19:20 crc kubenswrapper[4802]: E1201 20:19:20.127339 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e\": container with ID starting with 840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e not found: ID does not exist" containerID="840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.127373 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e"} err="failed to get container status \"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e\": rpc 
error: code = NotFound desc = could not find container \"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e\": container with ID starting with 840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.127391 4802 scope.go:117] "RemoveContainer" containerID="14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035" Dec 01 20:19:20 crc kubenswrapper[4802]: E1201 20:19:20.127661 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035\": container with ID starting with 14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035 not found: ID does not exist" containerID="14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.127690 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035"} err="failed to get container status \"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035\": rpc error: code = NotFound desc = could not find container \"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035\": container with ID starting with 14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.127708 4802 scope.go:117] "RemoveContainer" containerID="ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.128284 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201"} err="failed to get container status 
\"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201\": rpc error: code = NotFound desc = could not find container \"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201\": container with ID starting with ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.128329 4802 scope.go:117] "RemoveContainer" containerID="c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.128675 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67"} err="failed to get container status \"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67\": rpc error: code = NotFound desc = could not find container \"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67\": container with ID starting with c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.128719 4802 scope.go:117] "RemoveContainer" containerID="840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.129389 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e"} err="failed to get container status \"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e\": rpc error: code = NotFound desc = could not find container \"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e\": container with ID starting with 840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.129420 4802 scope.go:117] "RemoveContainer" 
containerID="14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.129766 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035"} err="failed to get container status \"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035\": rpc error: code = NotFound desc = could not find container \"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035\": container with ID starting with 14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.129792 4802 scope.go:117] "RemoveContainer" containerID="ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.130088 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201"} err="failed to get container status \"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201\": rpc error: code = NotFound desc = could not find container \"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201\": container with ID starting with ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.130159 4802 scope.go:117] "RemoveContainer" containerID="c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.130481 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67"} err="failed to get container status \"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67\": rpc error: code = NotFound desc = could 
not find container \"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67\": container with ID starting with c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.130507 4802 scope.go:117] "RemoveContainer" containerID="840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.130848 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e"} err="failed to get container status \"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e\": rpc error: code = NotFound desc = could not find container \"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e\": container with ID starting with 840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.130871 4802 scope.go:117] "RemoveContainer" containerID="14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.131170 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035"} err="failed to get container status \"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035\": rpc error: code = NotFound desc = could not find container \"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035\": container with ID starting with 14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.131219 4802 scope.go:117] "RemoveContainer" containerID="ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 
20:19:20.131540 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201"} err="failed to get container status \"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201\": rpc error: code = NotFound desc = could not find container \"ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201\": container with ID starting with ea0d21263c769294d75f9e29851bd4892b4cbf45cba5e7f071b97a818f0a7201 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.131567 4802 scope.go:117] "RemoveContainer" containerID="c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.131931 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67"} err="failed to get container status \"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67\": rpc error: code = NotFound desc = could not find container \"c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67\": container with ID starting with c9c3e8768ce6aeda6bb2c0a5fe7d7912d93da509e3cfaf701fd4ec19eac69d67 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.131952 4802 scope.go:117] "RemoveContainer" containerID="840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.132333 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e"} err="failed to get container status \"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e\": rpc error: code = NotFound desc = could not find container \"840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e\": container with ID starting with 
840a6e5d0073dac0a12f14799ce249b950f35ac1bcae5690f4deabfbd8cc694e not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.132365 4802 scope.go:117] "RemoveContainer" containerID="14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.132665 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035"} err="failed to get container status \"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035\": rpc error: code = NotFound desc = could not find container \"14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035\": container with ID starting with 14a7c0592ea09ea0dfae4fe98dd133b8188a470cc0d16320853d00f997f9b035 not found: ID does not exist" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.149698 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6v64\" (UniqueName: \"kubernetes.io/projected/fcc16b80-7583-4c97-a61c-da50c8c5750c-kube-api-access-x6v64\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.149760 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.149802 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-scripts\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 
01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.149831 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.149873 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-config-data\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.149907 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-run-httpd\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.149965 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-log-httpd\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.251374 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.251454 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-scripts\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.251480 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.251499 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-config-data\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.251531 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-run-httpd\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.251580 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-log-httpd\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.251818 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6v64\" (UniqueName: \"kubernetes.io/projected/fcc16b80-7583-4c97-a61c-da50c8c5750c-kube-api-access-x6v64\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: 
I1201 20:19:20.252540 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-log-httpd\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.252868 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-run-httpd\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.256983 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.257395 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.257635 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-config-data\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.257808 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-scripts\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " 
pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.282066 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6v64\" (UniqueName: \"kubernetes.io/projected/fcc16b80-7583-4c97-a61c-da50c8c5750c-kube-api-access-x6v64\") pod \"ceilometer-0\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.412034 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.734856 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8200b146-6478-497b-ac37-d42d4e947a2d" path="/var/lib/kubelet/pods/8200b146-6478-497b-ac37-d42d4e947a2d/volumes" Dec 01 20:19:20 crc kubenswrapper[4802]: I1201 20:19:20.870998 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:20 crc kubenswrapper[4802]: W1201 20:19:20.872850 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcc16b80_7583_4c97_a61c_da50c8c5750c.slice/crio-f7a8b04385218b18d50de25716252197c1a91f9b0fd6a4350d95d4c763201a46 WatchSource:0}: Error finding container f7a8b04385218b18d50de25716252197c1a91f9b0fd6a4350d95d4c763201a46: Status 404 returned error can't find the container with id f7a8b04385218b18d50de25716252197c1a91f9b0fd6a4350d95d4c763201a46 Dec 01 20:19:21 crc kubenswrapper[4802]: I1201 20:19:21.046992 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerStarted","Data":"f7a8b04385218b18d50de25716252197c1a91f9b0fd6a4350d95d4c763201a46"} Dec 01 20:19:23 crc kubenswrapper[4802]: I1201 20:19:23.069918 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerStarted","Data":"1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad"} Dec 01 20:19:24 crc kubenswrapper[4802]: I1201 20:19:24.083227 4802 generic.go:334] "Generic (PLEG): container finished" podID="fc32fb72-22ab-49e4-89b3-bb5aaf4c3456" containerID="63a09e19a16e3d246e87761831ab8a56de058b847c8ffab420ed16ab34b2dc0a" exitCode=0 Dec 01 20:19:24 crc kubenswrapper[4802]: I1201 20:19:24.083312 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r7wmv" event={"ID":"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456","Type":"ContainerDied","Data":"63a09e19a16e3d246e87761831ab8a56de058b847c8ffab420ed16ab34b2dc0a"} Dec 01 20:19:24 crc kubenswrapper[4802]: I1201 20:19:24.086839 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerStarted","Data":"81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f"} Dec 01 20:19:24 crc kubenswrapper[4802]: I1201 20:19:24.086884 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerStarted","Data":"b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719"} Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.545798 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.669927 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-config-data\") pod \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.670115 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-scripts\") pod \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.670177 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-combined-ca-bundle\") pod \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.670289 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwwl6\" (UniqueName: \"kubernetes.io/projected/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-kube-api-access-bwwl6\") pod \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\" (UID: \"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456\") " Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.675139 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-kube-api-access-bwwl6" (OuterVolumeSpecName: "kube-api-access-bwwl6") pod "fc32fb72-22ab-49e4-89b3-bb5aaf4c3456" (UID: "fc32fb72-22ab-49e4-89b3-bb5aaf4c3456"). InnerVolumeSpecName "kube-api-access-bwwl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.676276 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-scripts" (OuterVolumeSpecName: "scripts") pod "fc32fb72-22ab-49e4-89b3-bb5aaf4c3456" (UID: "fc32fb72-22ab-49e4-89b3-bb5aaf4c3456"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.698154 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-config-data" (OuterVolumeSpecName: "config-data") pod "fc32fb72-22ab-49e4-89b3-bb5aaf4c3456" (UID: "fc32fb72-22ab-49e4-89b3-bb5aaf4c3456"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.716676 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc32fb72-22ab-49e4-89b3-bb5aaf4c3456" (UID: "fc32fb72-22ab-49e4-89b3-bb5aaf4c3456"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.772419 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.772451 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.772464 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwwl6\" (UniqueName: \"kubernetes.io/projected/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-kube-api-access-bwwl6\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:25 crc kubenswrapper[4802]: I1201 20:19:25.772472 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.104501 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-r7wmv" event={"ID":"fc32fb72-22ab-49e4-89b3-bb5aaf4c3456","Type":"ContainerDied","Data":"c4b6cf022d6d89a5ea15774dba29cecf367c83fcb9a3871f2eaae11c8f4d3c3e"} Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.104538 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4b6cf022d6d89a5ea15774dba29cecf367c83fcb9a3871f2eaae11c8f4d3c3e" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.104508 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-r7wmv" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.108109 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerStarted","Data":"90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439"} Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.108282 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.150636 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.622173274 podStartE2EDuration="6.150619937s" podCreationTimestamp="2025-12-01 20:19:20 +0000 UTC" firstStartedPulling="2025-12-01 20:19:20.882461472 +0000 UTC m=+1382.445021113" lastFinishedPulling="2025-12-01 20:19:25.410908135 +0000 UTC m=+1386.973467776" observedRunningTime="2025-12-01 20:19:26.135774221 +0000 UTC m=+1387.698333862" watchObservedRunningTime="2025-12-01 20:19:26.150619937 +0000 UTC m=+1387.713179578" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.210856 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 20:19:26 crc kubenswrapper[4802]: E1201 20:19:26.211236 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc32fb72-22ab-49e4-89b3-bb5aaf4c3456" containerName="nova-cell0-conductor-db-sync" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.211252 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc32fb72-22ab-49e4-89b3-bb5aaf4c3456" containerName="nova-cell0-conductor-db-sync" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.211426 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc32fb72-22ab-49e4-89b3-bb5aaf4c3456" containerName="nova-cell0-conductor-db-sync" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 
20:19:26.211962 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.214295 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-k47dl" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.216166 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.236268 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.283621 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f926810d-a46a-4504-9117-1584f02f386a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f926810d-a46a-4504-9117-1584f02f386a\") " pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.284010 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f926810d-a46a-4504-9117-1584f02f386a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f926810d-a46a-4504-9117-1584f02f386a\") " pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.284031 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbndw\" (UniqueName: \"kubernetes.io/projected/f926810d-a46a-4504-9117-1584f02f386a-kube-api-access-nbndw\") pod \"nova-cell0-conductor-0\" (UID: \"f926810d-a46a-4504-9117-1584f02f386a\") " pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.386100 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f926810d-a46a-4504-9117-1584f02f386a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f926810d-a46a-4504-9117-1584f02f386a\") " pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.386253 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f926810d-a46a-4504-9117-1584f02f386a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f926810d-a46a-4504-9117-1584f02f386a\") " pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.386282 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbndw\" (UniqueName: \"kubernetes.io/projected/f926810d-a46a-4504-9117-1584f02f386a-kube-api-access-nbndw\") pod \"nova-cell0-conductor-0\" (UID: \"f926810d-a46a-4504-9117-1584f02f386a\") " pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.390966 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f926810d-a46a-4504-9117-1584f02f386a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f926810d-a46a-4504-9117-1584f02f386a\") " pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.403952 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f926810d-a46a-4504-9117-1584f02f386a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f926810d-a46a-4504-9117-1584f02f386a\") " pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.406517 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbndw\" (UniqueName: \"kubernetes.io/projected/f926810d-a46a-4504-9117-1584f02f386a-kube-api-access-nbndw\") pod 
\"nova-cell0-conductor-0\" (UID: \"f926810d-a46a-4504-9117-1584f02f386a\") " pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.530045 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:26 crc kubenswrapper[4802]: W1201 20:19:26.985995 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf926810d_a46a_4504_9117_1584f02f386a.slice/crio-f692a952b73d1053f8cf9b3780e02760da9b677a858ecfa00c0e29c0a8b6622d WatchSource:0}: Error finding container f692a952b73d1053f8cf9b3780e02760da9b677a858ecfa00c0e29c0a8b6622d: Status 404 returned error can't find the container with id f692a952b73d1053f8cf9b3780e02760da9b677a858ecfa00c0e29c0a8b6622d Dec 01 20:19:26 crc kubenswrapper[4802]: I1201 20:19:26.998230 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 20:19:27 crc kubenswrapper[4802]: I1201 20:19:27.118213 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f926810d-a46a-4504-9117-1584f02f386a","Type":"ContainerStarted","Data":"f692a952b73d1053f8cf9b3780e02760da9b677a858ecfa00c0e29c0a8b6622d"} Dec 01 20:19:28 crc kubenswrapper[4802]: I1201 20:19:28.133268 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f926810d-a46a-4504-9117-1584f02f386a","Type":"ContainerStarted","Data":"b6c5618ef1a3a06abe0ef1dbbb8ac83ed6c9a1c9b5802e1d7fd45cf217a5f39e"} Dec 01 20:19:28 crc kubenswrapper[4802]: I1201 20:19:28.134473 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:28 crc kubenswrapper[4802]: I1201 20:19:28.180251 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.180225734 
podStartE2EDuration="2.180225734s" podCreationTimestamp="2025-12-01 20:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:19:28.168552677 +0000 UTC m=+1389.731112348" watchObservedRunningTime="2025-12-01 20:19:28.180225734 +0000 UTC m=+1389.742785395" Dec 01 20:19:36 crc kubenswrapper[4802]: I1201 20:19:36.564851 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.044176 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-x9d74"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.046777 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.050299 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.050625 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.065353 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x9d74"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.203141 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-config-data\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.203271 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-scripts\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.203302 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnw7\" (UniqueName: \"kubernetes.io/projected/ffbb23b4-8984-4d8f-8301-81c25799727d-kube-api-access-qfnw7\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.203331 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.247795 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.259904 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.262443 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.266985 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.268174 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.271615 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.285495 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.304804 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-config-data\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.304912 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-scripts\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.304945 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnw7\" (UniqueName: \"kubernetes.io/projected/ffbb23b4-8984-4d8f-8301-81c25799727d-kube-api-access-qfnw7\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.304974 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 
20:19:37.312441 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-scripts\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.324906 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-config-data\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.328830 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.333846 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnw7\" (UniqueName: \"kubernetes.io/projected/ffbb23b4-8984-4d8f-8301-81c25799727d-kube-api-access-qfnw7\") pod \"nova-cell0-cell-mapping-x9d74\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.334060 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.361325 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.362810 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.366827 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.367689 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.376745 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.407352 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46wj\" (UniqueName: \"kubernetes.io/projected/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-kube-api-access-g46wj\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.407552 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgfj9\" (UniqueName: \"kubernetes.io/projected/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-kube-api-access-tgfj9\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.407614 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-config-data\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.407890 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.408458 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-logs\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.408688 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.408753 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.448322 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.449813 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.453710 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.472462 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524244 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-logs\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524555 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524577 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-logs\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524625 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-config-data\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524644 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524660 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524676 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dd9x\" (UniqueName: \"kubernetes.io/projected/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-kube-api-access-8dd9x\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524698 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-config-data\") pod \"nova-scheduler-0\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524725 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g46wj\" (UniqueName: \"kubernetes.io/projected/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-kube-api-access-g46wj\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524773 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngq68\" (UniqueName: \"kubernetes.io/projected/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-kube-api-access-ngq68\") pod 
\"nova-scheduler-0\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524796 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524819 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524840 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgfj9\" (UniqueName: \"kubernetes.io/projected/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-kube-api-access-tgfj9\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.524860 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-config-data\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.527358 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-logs\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 
20:19:37.532035 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.532472 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.533278 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.538315 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-config-data\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.555761 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgfj9\" (UniqueName: \"kubernetes.io/projected/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-kube-api-access-tgfj9\") pod \"nova-cell1-novncproxy-0\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.557010 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46wj\" (UniqueName: 
\"kubernetes.io/projected/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-kube-api-access-g46wj\") pod \"nova-api-0\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.581706 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-jptns"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.583337 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.584585 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.594063 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.603587 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-jptns"] Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.628693 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngq68\" (UniqueName: \"kubernetes.io/projected/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-kube-api-access-ngq68\") pod \"nova-scheduler-0\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.628754 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.628799 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.628826 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llz7v\" (UniqueName: \"kubernetes.io/projected/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-kube-api-access-llz7v\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.628899 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-config\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.628922 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-dns-svc\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.628963 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-logs\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.628994 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.629124 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-config-data\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.629153 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.629173 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dd9x\" (UniqueName: \"kubernetes.io/projected/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-kube-api-access-8dd9x\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.629224 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-config-data\") pod \"nova-scheduler-0\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.631702 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-logs\") pod \"nova-metadata-0\" (UID: 
\"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.633339 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.634665 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.636397 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-config-data\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.643991 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-config-data\") pod \"nova-scheduler-0\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.658384 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngq68\" (UniqueName: \"kubernetes.io/projected/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-kube-api-access-ngq68\") pod \"nova-scheduler-0\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.662556 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8dd9x\" (UniqueName: \"kubernetes.io/projected/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-kube-api-access-8dd9x\") pod \"nova-metadata-0\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.731766 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-config\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.732563 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-dns-svc\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.732631 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.732773 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.732866 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-config\") pod 
\"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.732917 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llz7v\" (UniqueName: \"kubernetes.io/projected/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-kube-api-access-llz7v\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.732364 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.733461 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-dns-svc\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.734686 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.740216 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.761017 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llz7v\" 
(UniqueName: \"kubernetes.io/projected/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-kube-api-access-llz7v\") pod \"dnsmasq-dns-566b5b7845-jptns\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.915747 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:37 crc kubenswrapper[4802]: I1201 20:19:37.925964 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.022980 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pwkmp"] Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.024336 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.026519 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.026736 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.037919 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-config-data\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.037992 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.038021 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-scripts\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.038057 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pwkmp"] Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.038071 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxm5w\" (UniqueName: \"kubernetes.io/projected/c58f6525-0009-410b-ba8d-56e7256e9d32-kube-api-access-fxm5w\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.072776 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-x9d74"] Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.143073 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-config-data\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.171069 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.171150 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-scripts\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.171283 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxm5w\" (UniqueName: \"kubernetes.io/projected/c58f6525-0009-410b-ba8d-56e7256e9d32-kube-api-access-fxm5w\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.239343 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.246274 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.247099 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-scripts\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.249018 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-config-data\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.295408 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x9d74" event={"ID":"ffbb23b4-8984-4d8f-8301-81c25799727d","Type":"ContainerStarted","Data":"08a05b677b3a2b66d82cf91ce300d9f7fbf9df5e4138d8f2142b047d64c6b3ac"} Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.297300 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxm5w\" (UniqueName: \"kubernetes.io/projected/c58f6525-0009-410b-ba8d-56e7256e9d32-kube-api-access-fxm5w\") pod \"nova-cell1-conductor-db-sync-pwkmp\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.299548 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 20:19:38 crc kubenswrapper[4802]: W1201 20:19:38.346256 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb81d2d8_c2ae_4d47_9982_39f5f741ea34.slice/crio-8e79f5265c065e15098e142f046e6ceff341953befa9aa83ffecf925ccacfdde WatchSource:0}: Error finding container 8e79f5265c065e15098e142f046e6ceff341953befa9aa83ffecf925ccacfdde: Status 404 returned error can't find the container with id 8e79f5265c065e15098e142f046e6ceff341953befa9aa83ffecf925ccacfdde Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.366956 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.549845 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:19:38 crc kubenswrapper[4802]: W1201 20:19:38.662022 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd747e6ae_2882_4d75_b8e3_f76d78a73dfa.slice/crio-1ad88db8b9e11703e55864a6737c78823457693a6fcfd9fa5216adeb5772de3e WatchSource:0}: Error finding container 1ad88db8b9e11703e55864a6737c78823457693a6fcfd9fa5216adeb5772de3e: Status 404 returned error can't find the container with id 1ad88db8b9e11703e55864a6737c78823457693a6fcfd9fa5216adeb5772de3e Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.664971 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.763701 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-jptns"] Dec 01 20:19:38 crc kubenswrapper[4802]: W1201 20:19:38.763826 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d4d4c6a_d6b8_4f83_b7ea_47757fc32c77.slice/crio-1fa123b3ed1ec5796dc2abf9c397e80f677a628e5c480069188c9809b3fe4d74 WatchSource:0}: Error finding container 1fa123b3ed1ec5796dc2abf9c397e80f677a628e5c480069188c9809b3fe4d74: Status 404 returned error can't find the container with id 1fa123b3ed1ec5796dc2abf9c397e80f677a628e5c480069188c9809b3fe4d74 Dec 01 20:19:38 crc kubenswrapper[4802]: I1201 20:19:38.885981 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pwkmp"] Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.306654 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d747e6ae-2882-4d75-b8e3-f76d78a73dfa","Type":"ContainerStarted","Data":"1ad88db8b9e11703e55864a6737c78823457693a6fcfd9fa5216adeb5772de3e"} Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.308668 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pwkmp" event={"ID":"c58f6525-0009-410b-ba8d-56e7256e9d32","Type":"ContainerStarted","Data":"280ebc12bc9cce6b2e9e70d1c3369d3c6a845c79e2cdc14e43e4779917952ab0"} Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.308738 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pwkmp" event={"ID":"c58f6525-0009-410b-ba8d-56e7256e9d32","Type":"ContainerStarted","Data":"3a318180d85119e90879a025d1af35c53ed71772497dfcd6169baa7513873218"} Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.310042 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73","Type":"ContainerStarted","Data":"44f151a28426228d7a60a28ab4d6f79b3fc6239f16837b31bc7f71a4848c90cf"} Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.311635 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x9d74" event={"ID":"ffbb23b4-8984-4d8f-8301-81c25799727d","Type":"ContainerStarted","Data":"bc1456a0928d6b42f7cc78973ebf42570d5bd634528b094e894c37af799ab3a8"} Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.313787 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f","Type":"ContainerStarted","Data":"5af0820e5f322d212e899dd79eb7d3dfd2f074f5a35ac70aab836d919cfa58c4"} Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.315064 4802 generic.go:334] "Generic (PLEG): container finished" podID="3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" containerID="226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2" exitCode=0 Dec 01 20:19:39 crc 
kubenswrapper[4802]: I1201 20:19:39.315117 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-jptns" event={"ID":"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77","Type":"ContainerDied","Data":"226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2"} Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.315159 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-jptns" event={"ID":"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77","Type":"ContainerStarted","Data":"1fa123b3ed1ec5796dc2abf9c397e80f677a628e5c480069188c9809b3fe4d74"} Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.316091 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb81d2d8-c2ae-4d47-9982-39f5f741ea34","Type":"ContainerStarted","Data":"8e79f5265c065e15098e142f046e6ceff341953befa9aa83ffecf925ccacfdde"} Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.335653 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-pwkmp" podStartSLOduration=2.335634401 podStartE2EDuration="2.335634401s" podCreationTimestamp="2025-12-01 20:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:19:39.325882205 +0000 UTC m=+1400.888441866" watchObservedRunningTime="2025-12-01 20:19:39.335634401 +0000 UTC m=+1400.898194042" Dec 01 20:19:39 crc kubenswrapper[4802]: I1201 20:19:39.383187 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-x9d74" podStartSLOduration=2.383171633 podStartE2EDuration="2.383171633s" podCreationTimestamp="2025-12-01 20:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:19:39.378667462 +0000 UTC m=+1400.941227103" 
watchObservedRunningTime="2025-12-01 20:19:39.383171633 +0000 UTC m=+1400.945731274" Dec 01 20:19:40 crc kubenswrapper[4802]: I1201 20:19:40.803031 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:40 crc kubenswrapper[4802]: I1201 20:19:40.843983 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.346392 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-jptns" event={"ID":"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77","Type":"ContainerStarted","Data":"3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e"} Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.347010 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.351693 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb81d2d8-c2ae-4d47-9982-39f5f741ea34","Type":"ContainerStarted","Data":"640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a"} Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.351780 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb81d2d8-c2ae-4d47-9982-39f5f741ea34","Type":"ContainerStarted","Data":"de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b"} Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.353865 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d747e6ae-2882-4d75-b8e3-f76d78a73dfa","Type":"ContainerStarted","Data":"b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8"} Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.353905 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d747e6ae-2882-4d75-b8e3-f76d78a73dfa","Type":"ContainerStarted","Data":"ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601"} Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.354081 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerName="nova-metadata-log" containerID="cri-o://ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601" gracePeriod=30 Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.354114 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerName="nova-metadata-metadata" containerID="cri-o://b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8" gracePeriod=30 Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.357090 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73","Type":"ContainerStarted","Data":"efb7abd5a3d6121fa75b809649c51c4cbfce30be0a3eefea3319993626982dcd"} Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.357228 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4a49dde2-367a-4cd0-8bc3-60aea4c2dd73" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://efb7abd5a3d6121fa75b809649c51c4cbfce30be0a3eefea3319993626982dcd" gracePeriod=30 Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.373663 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f","Type":"ContainerStarted","Data":"1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6"} Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.381555 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-566b5b7845-jptns" podStartSLOduration=5.381533402 podStartE2EDuration="5.381533402s" podCreationTimestamp="2025-12-01 20:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:19:42.370931599 +0000 UTC m=+1403.933491250" watchObservedRunningTime="2025-12-01 20:19:42.381533402 +0000 UTC m=+1403.944093063" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.401941 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.536188997 podStartE2EDuration="5.401925052s" podCreationTimestamp="2025-12-01 20:19:37 +0000 UTC" firstStartedPulling="2025-12-01 20:19:38.351293049 +0000 UTC m=+1399.913852690" lastFinishedPulling="2025-12-01 20:19:41.217029104 +0000 UTC m=+1402.779588745" observedRunningTime="2025-12-01 20:19:42.394802919 +0000 UTC m=+1403.957362560" watchObservedRunningTime="2025-12-01 20:19:42.401925052 +0000 UTC m=+1403.964484693" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.428837 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.594659462 podStartE2EDuration="5.428821056s" podCreationTimestamp="2025-12-01 20:19:37 +0000 UTC" firstStartedPulling="2025-12-01 20:19:38.35260418 +0000 UTC m=+1399.915163821" lastFinishedPulling="2025-12-01 20:19:41.186765774 +0000 UTC m=+1402.749325415" observedRunningTime="2025-12-01 20:19:42.421690423 +0000 UTC m=+1403.984250054" watchObservedRunningTime="2025-12-01 20:19:42.428821056 +0000 UTC m=+1403.991380697" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.450096 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.910368704 podStartE2EDuration="5.450078734s" podCreationTimestamp="2025-12-01 20:19:37 +0000 UTC" firstStartedPulling="2025-12-01 
20:19:38.671017127 +0000 UTC m=+1400.233576768" lastFinishedPulling="2025-12-01 20:19:41.210727147 +0000 UTC m=+1402.773286798" observedRunningTime="2025-12-01 20:19:42.443848938 +0000 UTC m=+1404.006408589" watchObservedRunningTime="2025-12-01 20:19:42.450078734 +0000 UTC m=+1404.012638375" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.461840 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.816125726 podStartE2EDuration="5.461818983s" podCreationTimestamp="2025-12-01 20:19:37 +0000 UTC" firstStartedPulling="2025-12-01 20:19:38.56218139 +0000 UTC m=+1400.124741031" lastFinishedPulling="2025-12-01 20:19:41.207874647 +0000 UTC m=+1402.770434288" observedRunningTime="2025-12-01 20:19:42.457459836 +0000 UTC m=+1404.020019477" watchObservedRunningTime="2025-12-01 20:19:42.461818983 +0000 UTC m=+1404.024378624" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.595172 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.749279 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.917056 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.917109 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.947929 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.989341 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-logs\") pod \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.989444 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-combined-ca-bundle\") pod \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.989520 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dd9x\" (UniqueName: \"kubernetes.io/projected/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-kube-api-access-8dd9x\") pod \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.989620 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-config-data\") pod \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\" (UID: \"d747e6ae-2882-4d75-b8e3-f76d78a73dfa\") " Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.989793 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-logs" (OuterVolumeSpecName: "logs") pod "d747e6ae-2882-4d75-b8e3-f76d78a73dfa" (UID: "d747e6ae-2882-4d75-b8e3-f76d78a73dfa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.990284 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:42 crc kubenswrapper[4802]: I1201 20:19:42.996361 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-kube-api-access-8dd9x" (OuterVolumeSpecName: "kube-api-access-8dd9x") pod "d747e6ae-2882-4d75-b8e3-f76d78a73dfa" (UID: "d747e6ae-2882-4d75-b8e3-f76d78a73dfa"). InnerVolumeSpecName "kube-api-access-8dd9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.020330 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-config-data" (OuterVolumeSpecName: "config-data") pod "d747e6ae-2882-4d75-b8e3-f76d78a73dfa" (UID: "d747e6ae-2882-4d75-b8e3-f76d78a73dfa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.020520 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d747e6ae-2882-4d75-b8e3-f76d78a73dfa" (UID: "d747e6ae-2882-4d75-b8e3-f76d78a73dfa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.092751 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dd9x\" (UniqueName: \"kubernetes.io/projected/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-kube-api-access-8dd9x\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.092796 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.092806 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d747e6ae-2882-4d75-b8e3-f76d78a73dfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.386742 4802 generic.go:334] "Generic (PLEG): container finished" podID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerID="b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8" exitCode=0 Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.386773 4802 generic.go:334] "Generic (PLEG): container finished" podID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerID="ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601" exitCode=143 Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.386791 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d747e6ae-2882-4d75-b8e3-f76d78a73dfa","Type":"ContainerDied","Data":"b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8"} Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.386848 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.386875 4802 scope.go:117] "RemoveContainer" containerID="b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.386862 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d747e6ae-2882-4d75-b8e3-f76d78a73dfa","Type":"ContainerDied","Data":"ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601"} Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.386923 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d747e6ae-2882-4d75-b8e3-f76d78a73dfa","Type":"ContainerDied","Data":"1ad88db8b9e11703e55864a6737c78823457693a6fcfd9fa5216adeb5772de3e"} Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.419732 4802 scope.go:117] "RemoveContainer" containerID="ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.453548 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.469941 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.474287 4802 scope.go:117] "RemoveContainer" containerID="b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8" Dec 01 20:19:43 crc kubenswrapper[4802]: E1201 20:19:43.489656 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8\": container with ID starting with b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8 not found: ID does not exist" containerID="b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8" Dec 01 20:19:43 crc kubenswrapper[4802]: 
I1201 20:19:43.489821 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8"} err="failed to get container status \"b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8\": rpc error: code = NotFound desc = could not find container \"b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8\": container with ID starting with b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8 not found: ID does not exist" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.489882 4802 scope.go:117] "RemoveContainer" containerID="ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.491066 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:43 crc kubenswrapper[4802]: E1201 20:19:43.491381 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601\": container with ID starting with ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601 not found: ID does not exist" containerID="ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.491459 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601"} err="failed to get container status \"ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601\": rpc error: code = NotFound desc = could not find container \"ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601\": container with ID starting with ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601 not found: ID does not exist" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.491487 
4802 scope.go:117] "RemoveContainer" containerID="b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8" Dec 01 20:19:43 crc kubenswrapper[4802]: E1201 20:19:43.491823 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerName="nova-metadata-log" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.491841 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerName="nova-metadata-log" Dec 01 20:19:43 crc kubenswrapper[4802]: E1201 20:19:43.491869 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerName="nova-metadata-metadata" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.491876 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerName="nova-metadata-metadata" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.492264 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerName="nova-metadata-log" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.492296 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" containerName="nova-metadata-metadata" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.493474 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8"} err="failed to get container status \"b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8\": rpc error: code = NotFound desc = could not find container \"b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8\": container with ID starting with b80eb9a798cb0da5035617649871b5aa64d629263a6fc2c3db57c0fe9b96aff8 not found: ID does not exist" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.493502 
4802 scope.go:117] "RemoveContainer" containerID="ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.493732 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.493916 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601"} err="failed to get container status \"ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601\": rpc error: code = NotFound desc = could not find container \"ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601\": container with ID starting with ca88fbce5f087312d17fdd726476ac36ac0c1f72243d1f5ffdf80a84963c9601 not found: ID does not exist" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.496692 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.496819 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.504060 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.600723 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-config-data\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.600809 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndk4t\" (UniqueName: 
\"kubernetes.io/projected/bfc019a1-c736-435c-90e3-b3e12b44d304-kube-api-access-ndk4t\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.600835 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.600918 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.600956 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfc019a1-c736-435c-90e3-b3e12b44d304-logs\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.703041 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.703127 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfc019a1-c736-435c-90e3-b3e12b44d304-logs\") pod \"nova-metadata-0\" (UID: 
\"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.703253 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-config-data\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.703310 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndk4t\" (UniqueName: \"kubernetes.io/projected/bfc019a1-c736-435c-90e3-b3e12b44d304-kube-api-access-ndk4t\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.703332 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.703991 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfc019a1-c736-435c-90e3-b3e12b44d304-logs\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.708894 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.708965 4802 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-config-data\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.708191 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.721731 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndk4t\" (UniqueName: \"kubernetes.io/projected/bfc019a1-c736-435c-90e3-b3e12b44d304-kube-api-access-ndk4t\") pod \"nova-metadata-0\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " pod="openstack/nova-metadata-0" Dec 01 20:19:43 crc kubenswrapper[4802]: I1201 20:19:43.814560 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:44 crc kubenswrapper[4802]: I1201 20:19:44.311903 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:44 crc kubenswrapper[4802]: W1201 20:19:44.316024 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfc019a1_c736_435c_90e3_b3e12b44d304.slice/crio-572c77941fde3c8cd838e91241165283ccdace473ffefcc6d0889eb5ef51c8d6 WatchSource:0}: Error finding container 572c77941fde3c8cd838e91241165283ccdace473ffefcc6d0889eb5ef51c8d6: Status 404 returned error can't find the container with id 572c77941fde3c8cd838e91241165283ccdace473ffefcc6d0889eb5ef51c8d6 Dec 01 20:19:44 crc kubenswrapper[4802]: I1201 20:19:44.396594 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bfc019a1-c736-435c-90e3-b3e12b44d304","Type":"ContainerStarted","Data":"572c77941fde3c8cd838e91241165283ccdace473ffefcc6d0889eb5ef51c8d6"} Dec 01 20:19:44 crc kubenswrapper[4802]: I1201 20:19:44.734930 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d747e6ae-2882-4d75-b8e3-f76d78a73dfa" path="/var/lib/kubelet/pods/d747e6ae-2882-4d75-b8e3-f76d78a73dfa/volumes" Dec 01 20:19:45 crc kubenswrapper[4802]: I1201 20:19:45.411375 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bfc019a1-c736-435c-90e3-b3e12b44d304","Type":"ContainerStarted","Data":"f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a"} Dec 01 20:19:45 crc kubenswrapper[4802]: I1201 20:19:45.411426 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bfc019a1-c736-435c-90e3-b3e12b44d304","Type":"ContainerStarted","Data":"967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833"} Dec 01 20:19:45 crc kubenswrapper[4802]: I1201 20:19:45.447487 4802 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.447464022 podStartE2EDuration="2.447464022s" podCreationTimestamp="2025-12-01 20:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:19:45.432238625 +0000 UTC m=+1406.994798286" watchObservedRunningTime="2025-12-01 20:19:45.447464022 +0000 UTC m=+1407.010023663" Dec 01 20:19:46 crc kubenswrapper[4802]: I1201 20:19:46.423090 4802 generic.go:334] "Generic (PLEG): container finished" podID="c58f6525-0009-410b-ba8d-56e7256e9d32" containerID="280ebc12bc9cce6b2e9e70d1c3369d3c6a845c79e2cdc14e43e4779917952ab0" exitCode=0 Dec 01 20:19:46 crc kubenswrapper[4802]: I1201 20:19:46.423253 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pwkmp" event={"ID":"c58f6525-0009-410b-ba8d-56e7256e9d32","Type":"ContainerDied","Data":"280ebc12bc9cce6b2e9e70d1c3369d3c6a845c79e2cdc14e43e4779917952ab0"} Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.442272 4802 generic.go:334] "Generic (PLEG): container finished" podID="ffbb23b4-8984-4d8f-8301-81c25799727d" containerID="bc1456a0928d6b42f7cc78973ebf42570d5bd634528b094e894c37af799ab3a8" exitCode=0 Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.443340 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x9d74" event={"ID":"ffbb23b4-8984-4d8f-8301-81c25799727d","Type":"ContainerDied","Data":"bc1456a0928d6b42f7cc78973ebf42570d5bd634528b094e894c37af799ab3a8"} Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.586348 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.586401 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 20:19:47 crc 
kubenswrapper[4802]: I1201 20:19:47.733361 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.773739 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.801594 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.895001 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-config-data\") pod \"c58f6525-0009-410b-ba8d-56e7256e9d32\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.895113 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-combined-ca-bundle\") pod \"c58f6525-0009-410b-ba8d-56e7256e9d32\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.895185 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxm5w\" (UniqueName: \"kubernetes.io/projected/c58f6525-0009-410b-ba8d-56e7256e9d32-kube-api-access-fxm5w\") pod \"c58f6525-0009-410b-ba8d-56e7256e9d32\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.895336 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-scripts\") pod \"c58f6525-0009-410b-ba8d-56e7256e9d32\" (UID: \"c58f6525-0009-410b-ba8d-56e7256e9d32\") " Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.900835 4802 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58f6525-0009-410b-ba8d-56e7256e9d32-kube-api-access-fxm5w" (OuterVolumeSpecName: "kube-api-access-fxm5w") pod "c58f6525-0009-410b-ba8d-56e7256e9d32" (UID: "c58f6525-0009-410b-ba8d-56e7256e9d32"). InnerVolumeSpecName "kube-api-access-fxm5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.905144 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-scripts" (OuterVolumeSpecName: "scripts") pod "c58f6525-0009-410b-ba8d-56e7256e9d32" (UID: "c58f6525-0009-410b-ba8d-56e7256e9d32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.927766 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c58f6525-0009-410b-ba8d-56e7256e9d32" (UID: "c58f6525-0009-410b-ba8d-56e7256e9d32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.928722 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.940468 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-config-data" (OuterVolumeSpecName: "config-data") pod "c58f6525-0009-410b-ba8d-56e7256e9d32" (UID: "c58f6525-0009-410b-ba8d-56e7256e9d32"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.997750 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxm5w\" (UniqueName: \"kubernetes.io/projected/c58f6525-0009-410b-ba8d-56e7256e9d32-kube-api-access-fxm5w\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.997781 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.997794 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:47 crc kubenswrapper[4802]: I1201 20:19:47.997804 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58f6525-0009-410b-ba8d-56e7256e9d32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.011651 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-dzvvs"] Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.011887 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" podUID="b3fe3132-a73e-4cab-b8a9-437a51d44de4" containerName="dnsmasq-dns" containerID="cri-o://4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564" gracePeriod=10 Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.409939 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.457598 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pwkmp" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.457564 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pwkmp" event={"ID":"c58f6525-0009-410b-ba8d-56e7256e9d32","Type":"ContainerDied","Data":"3a318180d85119e90879a025d1af35c53ed71772497dfcd6169baa7513873218"} Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.457770 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a318180d85119e90879a025d1af35c53ed71772497dfcd6169baa7513873218" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.470214 4802 generic.go:334] "Generic (PLEG): container finished" podID="b3fe3132-a73e-4cab-b8a9-437a51d44de4" containerID="4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564" exitCode=0 Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.470284 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.470313 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" event={"ID":"b3fe3132-a73e-4cab-b8a9-437a51d44de4","Type":"ContainerDied","Data":"4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564"} Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.470370 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-dzvvs" event={"ID":"b3fe3132-a73e-4cab-b8a9-437a51d44de4","Type":"ContainerDied","Data":"d7460f95a4223a0ac423aaa7923b9d43623a2eb25d7ab47818a03b9846854a7a"} Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.470413 4802 scope.go:117] "RemoveContainer" containerID="4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.504434 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-sb\") pod \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.504503 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-dns-svc\") pod \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.504535 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjpv2\" (UniqueName: \"kubernetes.io/projected/b3fe3132-a73e-4cab-b8a9-437a51d44de4-kube-api-access-bjpv2\") pod \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.504573 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-config\") pod \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.504605 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-nb\") pod \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\" (UID: \"b3fe3132-a73e-4cab-b8a9-437a51d44de4\") " Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.509144 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3fe3132-a73e-4cab-b8a9-437a51d44de4-kube-api-access-bjpv2" (OuterVolumeSpecName: "kube-api-access-bjpv2") pod "b3fe3132-a73e-4cab-b8a9-437a51d44de4" (UID: "b3fe3132-a73e-4cab-b8a9-437a51d44de4"). 
InnerVolumeSpecName "kube-api-access-bjpv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.532810 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.553837 4802 scope.go:117] "RemoveContainer" containerID="24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.560269 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3fe3132-a73e-4cab-b8a9-437a51d44de4" (UID: "b3fe3132-a73e-4cab-b8a9-437a51d44de4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.579524 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 20:19:48 crc kubenswrapper[4802]: E1201 20:19:48.580119 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58f6525-0009-410b-ba8d-56e7256e9d32" containerName="nova-cell1-conductor-db-sync" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.580297 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58f6525-0009-410b-ba8d-56e7256e9d32" containerName="nova-cell1-conductor-db-sync" Dec 01 20:19:48 crc kubenswrapper[4802]: E1201 20:19:48.580370 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3fe3132-a73e-4cab-b8a9-437a51d44de4" containerName="init" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.580431 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3fe3132-a73e-4cab-b8a9-437a51d44de4" containerName="init" Dec 01 20:19:48 crc kubenswrapper[4802]: E1201 20:19:48.580485 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3fe3132-a73e-4cab-b8a9-437a51d44de4" 
containerName="dnsmasq-dns" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.580533 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3fe3132-a73e-4cab-b8a9-437a51d44de4" containerName="dnsmasq-dns" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.580774 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3fe3132-a73e-4cab-b8a9-437a51d44de4" containerName="dnsmasq-dns" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.580848 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58f6525-0009-410b-ba8d-56e7256e9d32" containerName="nova-cell1-conductor-db-sync" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.581528 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.586222 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.605242 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-config" (OuterVolumeSpecName: "config") pod "b3fe3132-a73e-4cab-b8a9-437a51d44de4" (UID: "b3fe3132-a73e-4cab-b8a9-437a51d44de4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.605700 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhbbh\" (UniqueName: \"kubernetes.io/projected/27620668-7b86-40fb-af4b-0c2524e097a7-kube-api-access-zhbbh\") pod \"nova-cell1-conductor-0\" (UID: \"27620668-7b86-40fb-af4b-0c2524e097a7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.605759 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27620668-7b86-40fb-af4b-0c2524e097a7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"27620668-7b86-40fb-af4b-0c2524e097a7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.605841 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27620668-7b86-40fb-af4b-0c2524e097a7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"27620668-7b86-40fb-af4b-0c2524e097a7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.605897 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.605908 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjpv2\" (UniqueName: \"kubernetes.io/projected/b3fe3132-a73e-4cab-b8a9-437a51d44de4-kube-api-access-bjpv2\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.605918 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.611841 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3fe3132-a73e-4cab-b8a9-437a51d44de4" (UID: "b3fe3132-a73e-4cab-b8a9-437a51d44de4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.623685 4802 scope.go:117] "RemoveContainer" containerID="4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.623972 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 20:19:48 crc kubenswrapper[4802]: E1201 20:19:48.624233 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564\": container with ID starting with 4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564 not found: ID does not exist" containerID="4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.624340 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564"} err="failed to get container status \"4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564\": rpc error: code = NotFound desc = could not find container \"4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564\": container with ID starting with 4da5faaaa0b6e4b5a5bd09ce53a310df37b626ebb99a979a3f41439c0aa23564 not found: ID does not exist" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 
20:19:48.624453 4802 scope.go:117] "RemoveContainer" containerID="24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690" Dec 01 20:19:48 crc kubenswrapper[4802]: E1201 20:19:48.624787 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690\": container with ID starting with 24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690 not found: ID does not exist" containerID="24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.635162 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690"} err="failed to get container status \"24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690\": rpc error: code = NotFound desc = could not find container \"24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690\": container with ID starting with 24d659ff52548b0681ea4d7c76fc34e1a180b08510ddf3d76ac98a6e88753690 not found: ID does not exist" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.625875 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3fe3132-a73e-4cab-b8a9-437a51d44de4" (UID: "b3fe3132-a73e-4cab-b8a9-437a51d44de4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.669703 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.167:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.669961 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.167:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.707632 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27620668-7b86-40fb-af4b-0c2524e097a7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"27620668-7b86-40fb-af4b-0c2524e097a7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.707700 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhbbh\" (UniqueName: \"kubernetes.io/projected/27620668-7b86-40fb-af4b-0c2524e097a7-kube-api-access-zhbbh\") pod \"nova-cell1-conductor-0\" (UID: \"27620668-7b86-40fb-af4b-0c2524e097a7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.707784 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27620668-7b86-40fb-af4b-0c2524e097a7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"27620668-7b86-40fb-af4b-0c2524e097a7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.707899 4802 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.707915 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3fe3132-a73e-4cab-b8a9-437a51d44de4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.713186 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27620668-7b86-40fb-af4b-0c2524e097a7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"27620668-7b86-40fb-af4b-0c2524e097a7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.717945 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27620668-7b86-40fb-af4b-0c2524e097a7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"27620668-7b86-40fb-af4b-0c2524e097a7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.747586 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhbbh\" (UniqueName: \"kubernetes.io/projected/27620668-7b86-40fb-af4b-0c2524e097a7-kube-api-access-zhbbh\") pod \"nova-cell1-conductor-0\" (UID: \"27620668-7b86-40fb-af4b-0c2524e097a7\") " pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.814904 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.815179 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.894124 4802 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.906400 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-dzvvs"] Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.910585 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfnw7\" (UniqueName: \"kubernetes.io/projected/ffbb23b4-8984-4d8f-8301-81c25799727d-kube-api-access-qfnw7\") pod \"ffbb23b4-8984-4d8f-8301-81c25799727d\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.910846 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-scripts\") pod \"ffbb23b4-8984-4d8f-8301-81c25799727d\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.911045 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-config-data\") pod \"ffbb23b4-8984-4d8f-8301-81c25799727d\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.911187 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-combined-ca-bundle\") pod \"ffbb23b4-8984-4d8f-8301-81c25799727d\" (UID: \"ffbb23b4-8984-4d8f-8301-81c25799727d\") " Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.917790 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.918406 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbb23b4-8984-4d8f-8301-81c25799727d-kube-api-access-qfnw7" (OuterVolumeSpecName: "kube-api-access-qfnw7") pod "ffbb23b4-8984-4d8f-8301-81c25799727d" (UID: "ffbb23b4-8984-4d8f-8301-81c25799727d"). InnerVolumeSpecName "kube-api-access-qfnw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.918454 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-dzvvs"] Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.926124 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-scripts" (OuterVolumeSpecName: "scripts") pod "ffbb23b4-8984-4d8f-8301-81c25799727d" (UID: "ffbb23b4-8984-4d8f-8301-81c25799727d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.950232 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffbb23b4-8984-4d8f-8301-81c25799727d" (UID: "ffbb23b4-8984-4d8f-8301-81c25799727d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:48 crc kubenswrapper[4802]: I1201 20:19:48.977270 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-config-data" (OuterVolumeSpecName: "config-data") pod "ffbb23b4-8984-4d8f-8301-81c25799727d" (UID: "ffbb23b4-8984-4d8f-8301-81c25799727d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.015246 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.015486 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfnw7\" (UniqueName: \"kubernetes.io/projected/ffbb23b4-8984-4d8f-8301-81c25799727d-kube-api-access-qfnw7\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.015500 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.015510 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffbb23b4-8984-4d8f-8301-81c25799727d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.397404 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 20:19:49 crc kubenswrapper[4802]: W1201 20:19:49.404359 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27620668_7b86_40fb_af4b_0c2524e097a7.slice/crio-f415886cf28a50c6cae41e81ccfa5d8047860479e2910508db622c78208d53d7 WatchSource:0}: Error finding container f415886cf28a50c6cae41e81ccfa5d8047860479e2910508db622c78208d53d7: Status 404 returned error can't find the container with id f415886cf28a50c6cae41e81ccfa5d8047860479e2910508db622c78208d53d7 Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.483977 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-x9d74" 
event={"ID":"ffbb23b4-8984-4d8f-8301-81c25799727d","Type":"ContainerDied","Data":"08a05b677b3a2b66d82cf91ce300d9f7fbf9df5e4138d8f2142b047d64c6b3ac"} Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.484421 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08a05b677b3a2b66d82cf91ce300d9f7fbf9df5e4138d8f2142b047d64c6b3ac" Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.484548 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-x9d74" Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.488041 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"27620668-7b86-40fb-af4b-0c2524e097a7","Type":"ContainerStarted","Data":"f415886cf28a50c6cae41e81ccfa5d8047860479e2910508db622c78208d53d7"} Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.674097 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.674330 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-log" containerID="cri-o://de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b" gracePeriod=30 Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.674426 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-api" containerID="cri-o://640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a" gracePeriod=30 Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.705602 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.726843 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] 
Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.727302 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerName="nova-metadata-log" containerID="cri-o://967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833" gracePeriod=30 Dec 01 20:19:49 crc kubenswrapper[4802]: I1201 20:19:49.727408 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerName="nova-metadata-metadata" containerID="cri-o://f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a" gracePeriod=30 Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.255444 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.347744 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-nova-metadata-tls-certs\") pod \"bfc019a1-c736-435c-90e3-b3e12b44d304\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.348072 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-config-data\") pod \"bfc019a1-c736-435c-90e3-b3e12b44d304\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.348127 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfc019a1-c736-435c-90e3-b3e12b44d304-logs\") pod \"bfc019a1-c736-435c-90e3-b3e12b44d304\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.348167 
4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-combined-ca-bundle\") pod \"bfc019a1-c736-435c-90e3-b3e12b44d304\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.348467 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfc019a1-c736-435c-90e3-b3e12b44d304-logs" (OuterVolumeSpecName: "logs") pod "bfc019a1-c736-435c-90e3-b3e12b44d304" (UID: "bfc019a1-c736-435c-90e3-b3e12b44d304"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.349222 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndk4t\" (UniqueName: \"kubernetes.io/projected/bfc019a1-c736-435c-90e3-b3e12b44d304-kube-api-access-ndk4t\") pod \"bfc019a1-c736-435c-90e3-b3e12b44d304\" (UID: \"bfc019a1-c736-435c-90e3-b3e12b44d304\") " Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.349796 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfc019a1-c736-435c-90e3-b3e12b44d304-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.357455 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfc019a1-c736-435c-90e3-b3e12b44d304-kube-api-access-ndk4t" (OuterVolumeSpecName: "kube-api-access-ndk4t") pod "bfc019a1-c736-435c-90e3-b3e12b44d304" (UID: "bfc019a1-c736-435c-90e3-b3e12b44d304"). InnerVolumeSpecName "kube-api-access-ndk4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.383996 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-config-data" (OuterVolumeSpecName: "config-data") pod "bfc019a1-c736-435c-90e3-b3e12b44d304" (UID: "bfc019a1-c736-435c-90e3-b3e12b44d304"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.389827 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfc019a1-c736-435c-90e3-b3e12b44d304" (UID: "bfc019a1-c736-435c-90e3-b3e12b44d304"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.409222 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bfc019a1-c736-435c-90e3-b3e12b44d304" (UID: "bfc019a1-c736-435c-90e3-b3e12b44d304"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.417178 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.453511 4802 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.453549 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.453561 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc019a1-c736-435c-90e3-b3e12b44d304-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.453572 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndk4t\" (UniqueName: \"kubernetes.io/projected/bfc019a1-c736-435c-90e3-b3e12b44d304-kube-api-access-ndk4t\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.501598 4802 generic.go:334] "Generic (PLEG): container finished" podID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerID="f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a" exitCode=0 Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.501638 4802 generic.go:334] "Generic (PLEG): container finished" podID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerID="967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833" exitCode=143 Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.501666 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.501724 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bfc019a1-c736-435c-90e3-b3e12b44d304","Type":"ContainerDied","Data":"f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a"} Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.501788 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bfc019a1-c736-435c-90e3-b3e12b44d304","Type":"ContainerDied","Data":"967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833"} Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.501803 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bfc019a1-c736-435c-90e3-b3e12b44d304","Type":"ContainerDied","Data":"572c77941fde3c8cd838e91241165283ccdace473ffefcc6d0889eb5ef51c8d6"} Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.501841 4802 scope.go:117] "RemoveContainer" containerID="f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.504220 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"27620668-7b86-40fb-af4b-0c2524e097a7","Type":"ContainerStarted","Data":"ac5959ab28cfb0ad50142faa32387e68551238a622a8a2c3c9d7becbeefa6360"} Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.505364 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.507896 4802 generic.go:334] "Generic (PLEG): container finished" podID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerID="de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b" exitCode=143 Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.508047 4802 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/nova-scheduler-0" podUID="86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" containerName="nova-scheduler-scheduler" containerID="cri-o://1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6" gracePeriod=30 Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.508323 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb81d2d8-c2ae-4d47-9982-39f5f741ea34","Type":"ContainerDied","Data":"de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b"} Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.533058 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.533041407 podStartE2EDuration="2.533041407s" podCreationTimestamp="2025-12-01 20:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:19:50.522462284 +0000 UTC m=+1412.085021935" watchObservedRunningTime="2025-12-01 20:19:50.533041407 +0000 UTC m=+1412.095601048" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.540816 4802 scope.go:117] "RemoveContainer" containerID="967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.551380 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.562693 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.576355 4802 scope.go:117] "RemoveContainer" containerID="f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a" Dec 01 20:19:50 crc kubenswrapper[4802]: E1201 20:19:50.580558 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a\": container with ID starting with f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a not found: ID does not exist" containerID="f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.580594 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a"} err="failed to get container status \"f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a\": rpc error: code = NotFound desc = could not find container \"f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a\": container with ID starting with f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a not found: ID does not exist" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.580614 4802 scope.go:117] "RemoveContainer" containerID="967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833" Dec 01 20:19:50 crc kubenswrapper[4802]: E1201 20:19:50.585398 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833\": container with ID starting with 967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833 not found: ID does not exist" containerID="967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.585454 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833"} err="failed to get container status \"967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833\": rpc error: code = NotFound desc = could not find container \"967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833\": container with ID 
starting with 967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833 not found: ID does not exist" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.585486 4802 scope.go:117] "RemoveContainer" containerID="f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.585997 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a"} err="failed to get container status \"f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a\": rpc error: code = NotFound desc = could not find container \"f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a\": container with ID starting with f967b7321e89653cdfaa7c2ff08ce50f47a13d9b6fb8f991e322e7aa943cfe5a not found: ID does not exist" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.586017 4802 scope.go:117] "RemoveContainer" containerID="967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.586634 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833"} err="failed to get container status \"967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833\": rpc error: code = NotFound desc = could not find container \"967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833\": container with ID starting with 967e7fddfc6684a3d69f1032d8ff09843c339d838e8ac28214b5ea87b4a62833 not found: ID does not exist" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.594767 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:50 crc kubenswrapper[4802]: E1201 20:19:50.595183 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbb23b4-8984-4d8f-8301-81c25799727d" 
containerName="nova-manage" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.595221 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbb23b4-8984-4d8f-8301-81c25799727d" containerName="nova-manage" Dec 01 20:19:50 crc kubenswrapper[4802]: E1201 20:19:50.595234 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerName="nova-metadata-metadata" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.595242 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerName="nova-metadata-metadata" Dec 01 20:19:50 crc kubenswrapper[4802]: E1201 20:19:50.595266 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerName="nova-metadata-log" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.595273 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerName="nova-metadata-log" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.595512 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbb23b4-8984-4d8f-8301-81c25799727d" containerName="nova-manage" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.595533 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerName="nova-metadata-metadata" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.595544 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc019a1-c736-435c-90e3-b3e12b44d304" containerName="nova-metadata-log" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.596699 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.602419 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.602445 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.613779 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.656955 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdvbs\" (UniqueName: \"kubernetes.io/projected/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-kube-api-access-pdvbs\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.657019 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-config-data\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.657092 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.657147 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.657178 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-logs\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.742858 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3fe3132-a73e-4cab-b8a9-437a51d44de4" path="/var/lib/kubelet/pods/b3fe3132-a73e-4cab-b8a9-437a51d44de4/volumes" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.743612 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfc019a1-c736-435c-90e3-b3e12b44d304" path="/var/lib/kubelet/pods/bfc019a1-c736-435c-90e3-b3e12b44d304/volumes" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.758599 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.758659 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-logs\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.758787 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdvbs\" (UniqueName: 
\"kubernetes.io/projected/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-kube-api-access-pdvbs\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.758840 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-config-data\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.758960 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.759811 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-logs\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.762978 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.763325 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 
20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.764979 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-config-data\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.777660 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdvbs\" (UniqueName: \"kubernetes.io/projected/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-kube-api-access-pdvbs\") pod \"nova-metadata-0\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " pod="openstack/nova-metadata-0" Dec 01 20:19:50 crc kubenswrapper[4802]: I1201 20:19:50.921023 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:19:51 crc kubenswrapper[4802]: I1201 20:19:51.385128 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:19:51 crc kubenswrapper[4802]: W1201 20:19:51.386953 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f6e2953_0b0e_4e75_b450_03e3faf4bdf6.slice/crio-12690994fd94f6f3c419a26a552cbb8437cd674211387ff5519122207513ce73 WatchSource:0}: Error finding container 12690994fd94f6f3c419a26a552cbb8437cd674211387ff5519122207513ce73: Status 404 returned error can't find the container with id 12690994fd94f6f3c419a26a552cbb8437cd674211387ff5519122207513ce73 Dec 01 20:19:51 crc kubenswrapper[4802]: I1201 20:19:51.521118 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6","Type":"ContainerStarted","Data":"12690994fd94f6f3c419a26a552cbb8437cd674211387ff5519122207513ce73"} Dec 01 20:19:52 crc kubenswrapper[4802]: I1201 20:19:52.602505 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6","Type":"ContainerStarted","Data":"9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4"} Dec 01 20:19:52 crc kubenswrapper[4802]: I1201 20:19:52.602584 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6","Type":"ContainerStarted","Data":"f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34"} Dec 01 20:19:52 crc kubenswrapper[4802]: I1201 20:19:52.640214 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6401706860000003 podStartE2EDuration="2.640170686s" podCreationTimestamp="2025-12-01 20:19:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:19:52.632129904 +0000 UTC m=+1414.194689565" watchObservedRunningTime="2025-12-01 20:19:52.640170686 +0000 UTC m=+1414.202730337" Dec 01 20:19:52 crc kubenswrapper[4802]: E1201 20:19:52.736984 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 20:19:52 crc kubenswrapper[4802]: E1201 20:19:52.739286 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 20:19:52 crc kubenswrapper[4802]: E1201 20:19:52.741416 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 20:19:52 crc kubenswrapper[4802]: E1201 20:19:52.741469 4802 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" containerName="nova-scheduler-scheduler" Dec 01 20:19:52 crc kubenswrapper[4802]: I1201 20:19:52.769757 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 20:19:52 crc kubenswrapper[4802]: I1201 20:19:52.770030 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a6a6f4e1-7593-427b-b430-42c7b351e652" containerName="kube-state-metrics" containerID="cri-o://aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36" gracePeriod=30 Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.220831 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.304456 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hdx6\" (UniqueName: \"kubernetes.io/projected/a6a6f4e1-7593-427b-b430-42c7b351e652-kube-api-access-2hdx6\") pod \"a6a6f4e1-7593-427b-b430-42c7b351e652\" (UID: \"a6a6f4e1-7593-427b-b430-42c7b351e652\") " Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.310544 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a6f4e1-7593-427b-b430-42c7b351e652-kube-api-access-2hdx6" (OuterVolumeSpecName: "kube-api-access-2hdx6") pod "a6a6f4e1-7593-427b-b430-42c7b351e652" (UID: "a6a6f4e1-7593-427b-b430-42c7b351e652"). InnerVolumeSpecName "kube-api-access-2hdx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.407438 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hdx6\" (UniqueName: \"kubernetes.io/projected/a6a6f4e1-7593-427b-b430-42c7b351e652-kube-api-access-2hdx6\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.568909 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.611247 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngq68\" (UniqueName: \"kubernetes.io/projected/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-kube-api-access-ngq68\") pod \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.611696 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-config-data\") pod \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.611842 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-combined-ca-bundle\") pod \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\" (UID: \"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f\") " Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.619529 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-kube-api-access-ngq68" (OuterVolumeSpecName: "kube-api-access-ngq68") pod "86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" (UID: "86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f"). InnerVolumeSpecName "kube-api-access-ngq68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.628242 4802 generic.go:334] "Generic (PLEG): container finished" podID="a6a6f4e1-7593-427b-b430-42c7b351e652" containerID="aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36" exitCode=2 Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.628318 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.628335 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6a6f4e1-7593-427b-b430-42c7b351e652","Type":"ContainerDied","Data":"aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36"} Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.628383 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a6a6f4e1-7593-427b-b430-42c7b351e652","Type":"ContainerDied","Data":"b08d28cf818079fcd286d9cd40867d647ea9b59711c606f1c8313dbc5d75ea73"} Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.628404 4802 scope.go:117] "RemoveContainer" containerID="aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.631798 4802 generic.go:334] "Generic (PLEG): container finished" podID="86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" containerID="1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6" exitCode=0 Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.631853 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.631907 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f","Type":"ContainerDied","Data":"1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6"} Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.631935 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f","Type":"ContainerDied","Data":"5af0820e5f322d212e899dd79eb7d3dfd2f074f5a35ac70aab836d919cfa58c4"} Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.654426 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" (UID: "86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.655996 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-config-data" (OuterVolumeSpecName: "config-data") pod "86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" (UID: "86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.675345 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.678529 4802 scope.go:117] "RemoveContainer" containerID="aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36" Dec 01 20:19:53 crc kubenswrapper[4802]: E1201 20:19:53.680022 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36\": container with ID starting with aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36 not found: ID does not exist" containerID="aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.680062 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36"} err="failed to get container status \"aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36\": rpc error: code = NotFound desc = could not find container \"aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36\": container with ID starting with aac8775d6558175e6be41fa3d1e1780d3fc548d1fbafac8b04704cd3b52fbd36 not found: ID does not exist" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.680088 4802 scope.go:117] "RemoveContainer" containerID="1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.684225 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.692322 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 20:19:53 crc kubenswrapper[4802]: E1201 20:19:53.692858 
4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" containerName="nova-scheduler-scheduler" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.692879 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" containerName="nova-scheduler-scheduler" Dec 01 20:19:53 crc kubenswrapper[4802]: E1201 20:19:53.692922 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6a6f4e1-7593-427b-b430-42c7b351e652" containerName="kube-state-metrics" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.692930 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6a6f4e1-7593-427b-b430-42c7b351e652" containerName="kube-state-metrics" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.693097 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6a6f4e1-7593-427b-b430-42c7b351e652" containerName="kube-state-metrics" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.693117 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" containerName="nova-scheduler-scheduler" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.693980 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.696359 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.696596 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.699526 4802 scope.go:117] "RemoveContainer" containerID="1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.700066 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 20:19:53 crc kubenswrapper[4802]: E1201 20:19:53.701614 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6\": container with ID starting with 1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6 not found: ID does not exist" containerID="1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.701650 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6"} err="failed to get container status \"1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6\": rpc error: code = NotFound desc = could not find container \"1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6\": container with ID starting with 1e7709c523496aefd7526add378be71075823eb5ad96c6c9c022c9da0856e3a6 not found: ID does not exist" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.713758 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3358c4e8-0931-4d2e-82d6-527c54f3537c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.713999 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3358c4e8-0931-4d2e-82d6-527c54f3537c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.714219 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3358c4e8-0931-4d2e-82d6-527c54f3537c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.714474 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgnl5\" (UniqueName: \"kubernetes.io/projected/3358c4e8-0931-4d2e-82d6-527c54f3537c-kube-api-access-jgnl5\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.714733 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.714826 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:53 crc 
kubenswrapper[4802]: I1201 20:19:53.714892 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngq68\" (UniqueName: \"kubernetes.io/projected/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f-kube-api-access-ngq68\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.815790 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgnl5\" (UniqueName: \"kubernetes.io/projected/3358c4e8-0931-4d2e-82d6-527c54f3537c-kube-api-access-jgnl5\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.815921 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3358c4e8-0931-4d2e-82d6-527c54f3537c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.815952 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3358c4e8-0931-4d2e-82d6-527c54f3537c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.815995 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3358c4e8-0931-4d2e-82d6-527c54f3537c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.822043 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" 
(UniqueName: \"kubernetes.io/secret/3358c4e8-0931-4d2e-82d6-527c54f3537c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.823476 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3358c4e8-0931-4d2e-82d6-527c54f3537c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.826861 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3358c4e8-0931-4d2e-82d6-527c54f3537c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.828002 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.828311 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="ceilometer-central-agent" containerID="cri-o://1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad" gracePeriod=30 Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.828460 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="proxy-httpd" containerID="cri-o://90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439" gracePeriod=30 Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.828499 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="sg-core" containerID="cri-o://b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719" gracePeriod=30 Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.828537 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="ceilometer-notification-agent" containerID="cri-o://81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f" gracePeriod=30 Dec 01 20:19:53 crc kubenswrapper[4802]: I1201 20:19:53.845930 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgnl5\" (UniqueName: \"kubernetes.io/projected/3358c4e8-0931-4d2e-82d6-527c54f3537c-kube-api-access-jgnl5\") pod \"kube-state-metrics-0\" (UID: \"3358c4e8-0931-4d2e-82d6-527c54f3537c\") " pod="openstack/kube-state-metrics-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.015610 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.021391 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.031147 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.039725 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.041443 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.043918 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.051911 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.121145 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-config-data\") pod \"nova-scheduler-0\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.121313 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b7rw\" (UniqueName: \"kubernetes.io/projected/30203d04-83ed-42d6-9f9e-e7db08604e70-kube-api-access-5b7rw\") pod \"nova-scheduler-0\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.121364 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.222866 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.223306 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-config-data\") pod \"nova-scheduler-0\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.223358 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b7rw\" (UniqueName: \"kubernetes.io/projected/30203d04-83ed-42d6-9f9e-e7db08604e70-kube-api-access-5b7rw\") pod \"nova-scheduler-0\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.229230 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-config-data\") pod \"nova-scheduler-0\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.229230 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.247624 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b7rw\" (UniqueName: \"kubernetes.io/projected/30203d04-83ed-42d6-9f9e-e7db08604e70-kube-api-access-5b7rw\") pod \"nova-scheduler-0\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.438344 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.495085 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.510317 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.525391 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.630892 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-combined-ca-bundle\") pod \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.631329 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-logs\") pod \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.631483 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g46wj\" (UniqueName: \"kubernetes.io/projected/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-kube-api-access-g46wj\") pod \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.631529 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-config-data\") pod \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\" (UID: \"bb81d2d8-c2ae-4d47-9982-39f5f741ea34\") " Dec 01 20:19:54 crc kubenswrapper[4802]: 
I1201 20:19:54.632614 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-logs" (OuterVolumeSpecName: "logs") pod "bb81d2d8-c2ae-4d47-9982-39f5f741ea34" (UID: "bb81d2d8-c2ae-4d47-9982-39f5f741ea34"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.638190 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-kube-api-access-g46wj" (OuterVolumeSpecName: "kube-api-access-g46wj") pod "bb81d2d8-c2ae-4d47-9982-39f5f741ea34" (UID: "bb81d2d8-c2ae-4d47-9982-39f5f741ea34"). InnerVolumeSpecName "kube-api-access-g46wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.645147 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3358c4e8-0931-4d2e-82d6-527c54f3537c","Type":"ContainerStarted","Data":"7fb45e53b48263fac692b3ccf55bc377d8f94f775ac44120a41c0dfc0f509d37"} Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.653522 4802 generic.go:334] "Generic (PLEG): container finished" podID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerID="640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a" exitCode=0 Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.654036 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb81d2d8-c2ae-4d47-9982-39f5f741ea34","Type":"ContainerDied","Data":"640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a"} Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.654089 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bb81d2d8-c2ae-4d47-9982-39f5f741ea34","Type":"ContainerDied","Data":"8e79f5265c065e15098e142f046e6ceff341953befa9aa83ffecf925ccacfdde"} Dec 01 
20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.654113 4802 scope.go:117] "RemoveContainer" containerID="640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.654306 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.661504 4802 generic.go:334] "Generic (PLEG): container finished" podID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerID="90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439" exitCode=0 Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.661539 4802 generic.go:334] "Generic (PLEG): container finished" podID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerID="b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719" exitCode=2 Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.661549 4802 generic.go:334] "Generic (PLEG): container finished" podID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerID="1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad" exitCode=0 Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.661587 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerDied","Data":"90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439"} Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.661630 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerDied","Data":"b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719"} Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.661643 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerDied","Data":"1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad"} 
Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.666623 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb81d2d8-c2ae-4d47-9982-39f5f741ea34" (UID: "bb81d2d8-c2ae-4d47-9982-39f5f741ea34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.677184 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-config-data" (OuterVolumeSpecName: "config-data") pod "bb81d2d8-c2ae-4d47-9982-39f5f741ea34" (UID: "bb81d2d8-c2ae-4d47-9982-39f5f741ea34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.687252 4802 scope.go:117] "RemoveContainer" containerID="de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.726402 4802 scope.go:117] "RemoveContainer" containerID="640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a" Dec 01 20:19:54 crc kubenswrapper[4802]: E1201 20:19:54.726857 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a\": container with ID starting with 640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a not found: ID does not exist" containerID="640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.726914 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a"} err="failed to get container status 
\"640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a\": rpc error: code = NotFound desc = could not find container \"640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a\": container with ID starting with 640c446f6f44f751c1b5c4a7ed448d996d7c1cc259a9044544eb481b5053a51a not found: ID does not exist" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.726944 4802 scope.go:117] "RemoveContainer" containerID="de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b" Dec 01 20:19:54 crc kubenswrapper[4802]: E1201 20:19:54.727232 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b\": container with ID starting with de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b not found: ID does not exist" containerID="de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.727292 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b"} err="failed to get container status \"de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b\": rpc error: code = NotFound desc = could not find container \"de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b\": container with ID starting with de0cee10a375b83b7be675e7e77a0cdc76f28973acb0d1866a2d079b7973061b not found: ID does not exist" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.731818 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f" path="/var/lib/kubelet/pods/86e1f5e3-3f4a-48b4-8c2c-da14be0fd37f/volumes" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.732326 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a6f4e1-7593-427b-b430-42c7b351e652" 
path="/var/lib/kubelet/pods/a6a6f4e1-7593-427b-b430-42c7b351e652/volumes" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.733861 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g46wj\" (UniqueName: \"kubernetes.io/projected/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-kube-api-access-g46wj\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.733898 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.733911 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.733921 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb81d2d8-c2ae-4d47-9982-39f5f741ea34-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:54 crc kubenswrapper[4802]: I1201 20:19:54.924543 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:19:54 crc kubenswrapper[4802]: W1201 20:19:54.942086 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30203d04_83ed_42d6_9f9e_e7db08604e70.slice/crio-6c8b5fd32d55e532f771586a91679b1198120b2cdaa01798a2f95bfd34fd9c1e WatchSource:0}: Error finding container 6c8b5fd32d55e532f771586a91679b1198120b2cdaa01798a2f95bfd34fd9c1e: Status 404 returned error can't find the container with id 6c8b5fd32d55e532f771586a91679b1198120b2cdaa01798a2f95bfd34fd9c1e Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.104762 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:19:55 crc 
kubenswrapper[4802]: I1201 20:19:55.119022 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.129004 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 20:19:55 crc kubenswrapper[4802]: E1201 20:19:55.129465 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-log" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.129480 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-log" Dec 01 20:19:55 crc kubenswrapper[4802]: E1201 20:19:55.129502 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-api" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.129508 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-api" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.129671 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-log" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.129685 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" containerName="nova-api-api" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.130662 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.132541 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.139544 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.242449 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9757fe0-f447-4ac9-8bb1-7576191b4418-logs\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.242679 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhzbf\" (UniqueName: \"kubernetes.io/projected/d9757fe0-f447-4ac9-8bb1-7576191b4418-kube-api-access-bhzbf\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.242754 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-config-data\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.242907 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.345167 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.345681 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9757fe0-f447-4ac9-8bb1-7576191b4418-logs\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.345847 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhzbf\" (UniqueName: \"kubernetes.io/projected/d9757fe0-f447-4ac9-8bb1-7576191b4418-kube-api-access-bhzbf\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.345949 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-config-data\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.346493 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9757fe0-f447-4ac9-8bb1-7576191b4418-logs\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.352039 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-config-data\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.352846 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.364034 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhzbf\" (UniqueName: \"kubernetes.io/projected/d9757fe0-f447-4ac9-8bb1-7576191b4418-kube-api-access-bhzbf\") pod \"nova-api-0\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.454501 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.677917 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30203d04-83ed-42d6-9f9e-e7db08604e70","Type":"ContainerStarted","Data":"577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30"} Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.677960 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30203d04-83ed-42d6-9f9e-e7db08604e70","Type":"ContainerStarted","Data":"6c8b5fd32d55e532f771586a91679b1198120b2cdaa01798a2f95bfd34fd9c1e"} Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.701878 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.7018574229999999 podStartE2EDuration="1.701857423s" podCreationTimestamp="2025-12-01 20:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:19:55.700037226 +0000 UTC m=+1417.262596877" watchObservedRunningTime="2025-12-01 20:19:55.701857423 +0000 UTC m=+1417.264417064" Dec 01 20:19:55 crc 
kubenswrapper[4802]: I1201 20:19:55.902558 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:19:55 crc kubenswrapper[4802]: W1201 20:19:55.918252 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9757fe0_f447_4ac9_8bb1_7576191b4418.slice/crio-49875f918427fcf1b4302ab41e44f2103aedae5d750c4de567d02c752b69048c WatchSource:0}: Error finding container 49875f918427fcf1b4302ab41e44f2103aedae5d750c4de567d02c752b69048c: Status 404 returned error can't find the container with id 49875f918427fcf1b4302ab41e44f2103aedae5d750c4de567d02c752b69048c Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.921043 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 20:19:55 crc kubenswrapper[4802]: I1201 20:19:55.921094 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 20:19:56 crc kubenswrapper[4802]: I1201 20:19:56.688832 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3358c4e8-0931-4d2e-82d6-527c54f3537c","Type":"ContainerStarted","Data":"b81b1e643b3cec1e48e86c0eee9a815115354d0650c1a796ae5c9d80971931cb"} Dec 01 20:19:56 crc kubenswrapper[4802]: I1201 20:19:56.689161 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 20:19:56 crc kubenswrapper[4802]: I1201 20:19:56.692083 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9757fe0-f447-4ac9-8bb1-7576191b4418","Type":"ContainerStarted","Data":"fbf442ffa23c05d13159af090a968d0f87022c799004fc540034f4be42509714"} Dec 01 20:19:56 crc kubenswrapper[4802]: I1201 20:19:56.692121 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d9757fe0-f447-4ac9-8bb1-7576191b4418","Type":"ContainerStarted","Data":"9f5ed18f2689f6b2c76edb1ea0f439d1d49747eba9c2c8b2a28a2fdebc608276"} Dec 01 20:19:56 crc kubenswrapper[4802]: I1201 20:19:56.692135 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9757fe0-f447-4ac9-8bb1-7576191b4418","Type":"ContainerStarted","Data":"49875f918427fcf1b4302ab41e44f2103aedae5d750c4de567d02c752b69048c"} Dec 01 20:19:56 crc kubenswrapper[4802]: I1201 20:19:56.715085 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.283449327 podStartE2EDuration="3.71504786s" podCreationTimestamp="2025-12-01 20:19:53 +0000 UTC" firstStartedPulling="2025-12-01 20:19:54.494795189 +0000 UTC m=+1416.057354830" lastFinishedPulling="2025-12-01 20:19:55.926393722 +0000 UTC m=+1417.488953363" observedRunningTime="2025-12-01 20:19:56.702714474 +0000 UTC m=+1418.265274125" watchObservedRunningTime="2025-12-01 20:19:56.71504786 +0000 UTC m=+1418.277607551" Dec 01 20:19:56 crc kubenswrapper[4802]: I1201 20:19:56.734026 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb81d2d8-c2ae-4d47-9982-39f5f741ea34" path="/var/lib/kubelet/pods/bb81d2d8-c2ae-4d47-9982-39f5f741ea34/volumes" Dec 01 20:19:56 crc kubenswrapper[4802]: I1201 20:19:56.736271 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7362545169999999 podStartE2EDuration="1.736254517s" podCreationTimestamp="2025-12-01 20:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:19:56.735442101 +0000 UTC m=+1418.298001762" watchObservedRunningTime="2025-12-01 20:19:56.736254517 +0000 UTC m=+1418.298814168" Dec 01 20:19:58 crc kubenswrapper[4802]: I1201 20:19:58.948782 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.330125 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.427446 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6v64\" (UniqueName: \"kubernetes.io/projected/fcc16b80-7583-4c97-a61c-da50c8c5750c-kube-api-access-x6v64\") pod \"fcc16b80-7583-4c97-a61c-da50c8c5750c\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.427932 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-scripts\") pod \"fcc16b80-7583-4c97-a61c-da50c8c5750c\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.428148 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-config-data\") pod \"fcc16b80-7583-4c97-a61c-da50c8c5750c\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.428346 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-run-httpd\") pod \"fcc16b80-7583-4c97-a61c-da50c8c5750c\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.428479 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-combined-ca-bundle\") pod \"fcc16b80-7583-4c97-a61c-da50c8c5750c\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " Dec 01 20:19:59 
crc kubenswrapper[4802]: I1201 20:19:59.428590 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-log-httpd\") pod \"fcc16b80-7583-4c97-a61c-da50c8c5750c\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.428705 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-sg-core-conf-yaml\") pod \"fcc16b80-7583-4c97-a61c-da50c8c5750c\" (UID: \"fcc16b80-7583-4c97-a61c-da50c8c5750c\") " Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.428611 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fcc16b80-7583-4c97-a61c-da50c8c5750c" (UID: "fcc16b80-7583-4c97-a61c-da50c8c5750c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.429049 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fcc16b80-7583-4c97-a61c-da50c8c5750c" (UID: "fcc16b80-7583-4c97-a61c-da50c8c5750c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.430028 4802 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.430340 4802 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcc16b80-7583-4c97-a61c-da50c8c5750c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.434450 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-scripts" (OuterVolumeSpecName: "scripts") pod "fcc16b80-7583-4c97-a61c-da50c8c5750c" (UID: "fcc16b80-7583-4c97-a61c-da50c8c5750c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.436056 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc16b80-7583-4c97-a61c-da50c8c5750c-kube-api-access-x6v64" (OuterVolumeSpecName: "kube-api-access-x6v64") pod "fcc16b80-7583-4c97-a61c-da50c8c5750c" (UID: "fcc16b80-7583-4c97-a61c-da50c8c5750c"). InnerVolumeSpecName "kube-api-access-x6v64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.439292 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.468826 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fcc16b80-7583-4c97-a61c-da50c8c5750c" (UID: "fcc16b80-7583-4c97-a61c-da50c8c5750c"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.499929 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcc16b80-7583-4c97-a61c-da50c8c5750c" (UID: "fcc16b80-7583-4c97-a61c-da50c8c5750c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.532123 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6v64\" (UniqueName: \"kubernetes.io/projected/fcc16b80-7583-4c97-a61c-da50c8c5750c-kube-api-access-x6v64\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.532158 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.532174 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.532186 4802 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.553435 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-config-data" (OuterVolumeSpecName: "config-data") pod "fcc16b80-7583-4c97-a61c-da50c8c5750c" (UID: "fcc16b80-7583-4c97-a61c-da50c8c5750c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.634361 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcc16b80-7583-4c97-a61c-da50c8c5750c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.726784 4802 generic.go:334] "Generic (PLEG): container finished" podID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerID="81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f" exitCode=0 Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.726851 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerDied","Data":"81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f"} Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.726946 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.727137 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcc16b80-7583-4c97-a61c-da50c8c5750c","Type":"ContainerDied","Data":"f7a8b04385218b18d50de25716252197c1a91f9b0fd6a4350d95d4c763201a46"} Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.727157 4802 scope.go:117] "RemoveContainer" containerID="90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.758805 4802 scope.go:117] "RemoveContainer" containerID="b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.785886 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.788140 4802 scope.go:117] "RemoveContainer" 
containerID="81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.800118 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.815417 4802 scope.go:117] "RemoveContainer" containerID="1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.822325 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:59 crc kubenswrapper[4802]: E1201 20:19:59.822860 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="ceilometer-notification-agent" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.822885 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="ceilometer-notification-agent" Dec 01 20:19:59 crc kubenswrapper[4802]: E1201 20:19:59.822919 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="sg-core" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.822927 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="sg-core" Dec 01 20:19:59 crc kubenswrapper[4802]: E1201 20:19:59.822947 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="ceilometer-central-agent" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.822956 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="ceilometer-central-agent" Dec 01 20:19:59 crc kubenswrapper[4802]: E1201 20:19:59.822965 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="proxy-httpd" Dec 01 20:19:59 crc 
kubenswrapper[4802]: I1201 20:19:59.822971 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="proxy-httpd" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.823169 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="ceilometer-central-agent" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.823192 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="ceilometer-notification-agent" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.823226 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="sg-core" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.823242 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" containerName="proxy-httpd" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.825427 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.828292 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.828529 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.828823 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.840640 4802 scope.go:117] "RemoveContainer" containerID="90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439" Dec 01 20:19:59 crc kubenswrapper[4802]: E1201 20:19:59.842260 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439\": container with ID starting with 90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439 not found: ID does not exist" containerID="90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.842295 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439"} err="failed to get container status \"90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439\": rpc error: code = NotFound desc = could not find container \"90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439\": container with ID starting with 90d3bff4233d10c687c7ed41b3773a0fa73d8e85b71b8d1d69b110c6bde5b439 not found: ID does not exist" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.842325 4802 scope.go:117] "RemoveContainer" containerID="b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719" Dec 01 20:19:59 crc 
kubenswrapper[4802]: I1201 20:19:59.842465 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:19:59 crc kubenswrapper[4802]: E1201 20:19:59.842774 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719\": container with ID starting with b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719 not found: ID does not exist" containerID="b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.842803 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719"} err="failed to get container status \"b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719\": rpc error: code = NotFound desc = could not find container \"b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719\": container with ID starting with b993b23d61c4539ddb65c00d7e036a611b09afa4a0f1abb493c8ce447b1e4719 not found: ID does not exist" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.842826 4802 scope.go:117] "RemoveContainer" containerID="81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f" Dec 01 20:19:59 crc kubenswrapper[4802]: E1201 20:19:59.843441 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f\": container with ID starting with 81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f not found: ID does not exist" containerID="81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.843471 4802 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f"} err="failed to get container status \"81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f\": rpc error: code = NotFound desc = could not find container \"81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f\": container with ID starting with 81f61150219c9141ca13739b67aa8f97e7171957427aec233c9de14dc3432b3f not found: ID does not exist" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.843524 4802 scope.go:117] "RemoveContainer" containerID="1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad" Dec 01 20:19:59 crc kubenswrapper[4802]: E1201 20:19:59.848520 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad\": container with ID starting with 1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad not found: ID does not exist" containerID="1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.848607 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad"} err="failed to get container status \"1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad\": rpc error: code = NotFound desc = could not find container \"1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad\": container with ID starting with 1d68e00244dd60b3281ef276f7771d02a85bce38a5d587b1c5adc76188f8a4ad not found: ID does not exist" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.939710 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.939767 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.939807 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-scripts\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.939896 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.940110 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-config-data\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.940412 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-log-httpd\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 
20:19:59.940559 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-run-httpd\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:19:59 crc kubenswrapper[4802]: I1201 20:19:59.940598 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbftq\" (UniqueName: \"kubernetes.io/projected/a782c752-421f-4172-b3ef-5a2f6e772cd0-kube-api-access-vbftq\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.042491 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.042560 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-config-data\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.042630 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-log-httpd\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.042678 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-run-httpd\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.042710 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbftq\" (UniqueName: \"kubernetes.io/projected/a782c752-421f-4172-b3ef-5a2f6e772cd0-kube-api-access-vbftq\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.042754 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.042783 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.042825 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-scripts\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.043498 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-run-httpd\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: 
I1201 20:20:00.043545 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-log-httpd\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.049694 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.049901 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-scripts\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.050085 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.050805 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.055766 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-config-data\") pod \"ceilometer-0\" (UID: 
\"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.068132 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbftq\" (UniqueName: \"kubernetes.io/projected/a782c752-421f-4172-b3ef-5a2f6e772cd0-kube-api-access-vbftq\") pod \"ceilometer-0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.144425 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.640645 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.734457 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc16b80-7583-4c97-a61c-da50c8c5750c" path="/var/lib/kubelet/pods/fcc16b80-7583-4c97-a61c-da50c8c5750c/volumes" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.737645 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerStarted","Data":"8986a326c19cf3458d030d63d6d6dbb0efc04965184395a66e03b16441c73534"} Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.921579 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 20:20:00 crc kubenswrapper[4802]: I1201 20:20:00.921939 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 20:20:01 crc kubenswrapper[4802]: I1201 20:20:01.752695 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerStarted","Data":"1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48"} Dec 01 20:20:01 crc kubenswrapper[4802]: I1201 
20:20:01.935602 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.175:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 20:20:01 crc kubenswrapper[4802]: I1201 20:20:01.935875 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.175:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 20:20:02 crc kubenswrapper[4802]: I1201 20:20:02.766880 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerStarted","Data":"67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de"} Dec 01 20:20:03 crc kubenswrapper[4802]: I1201 20:20:03.779752 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerStarted","Data":"ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb"} Dec 01 20:20:04 crc kubenswrapper[4802]: I1201 20:20:04.029134 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 20:20:04 crc kubenswrapper[4802]: I1201 20:20:04.440166 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 20:20:04 crc kubenswrapper[4802]: I1201 20:20:04.486210 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 20:20:04 crc kubenswrapper[4802]: I1201 20:20:04.823927 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 
20:20:05 crc kubenswrapper[4802]: I1201 20:20:05.455143 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 20:20:05 crc kubenswrapper[4802]: I1201 20:20:05.455454 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 20:20:05 crc kubenswrapper[4802]: I1201 20:20:05.801561 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerStarted","Data":"f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a"} Dec 01 20:20:05 crc kubenswrapper[4802]: I1201 20:20:05.825122 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.580247456 podStartE2EDuration="6.825105997s" podCreationTimestamp="2025-12-01 20:19:59 +0000 UTC" firstStartedPulling="2025-12-01 20:20:00.657179058 +0000 UTC m=+1422.219738699" lastFinishedPulling="2025-12-01 20:20:04.902037599 +0000 UTC m=+1426.464597240" observedRunningTime="2025-12-01 20:20:05.822483935 +0000 UTC m=+1427.385043566" watchObservedRunningTime="2025-12-01 20:20:05.825105997 +0000 UTC m=+1427.387665628" Dec 01 20:20:06 crc kubenswrapper[4802]: I1201 20:20:06.538335 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 20:20:06 crc kubenswrapper[4802]: I1201 20:20:06.538349 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 20:20:06 crc kubenswrapper[4802]: 
I1201 20:20:06.825379 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 20:20:10 crc kubenswrapper[4802]: I1201 20:20:10.928505 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 20:20:10 crc kubenswrapper[4802]: I1201 20:20:10.934704 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 20:20:10 crc kubenswrapper[4802]: I1201 20:20:10.936151 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 20:20:11 crc kubenswrapper[4802]: I1201 20:20:11.876178 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 20:20:12 crc kubenswrapper[4802]: I1201 20:20:12.881614 4802 generic.go:334] "Generic (PLEG): container finished" podID="4a49dde2-367a-4cd0-8bc3-60aea4c2dd73" containerID="efb7abd5a3d6121fa75b809649c51c4cbfce30be0a3eefea3319993626982dcd" exitCode=137 Dec 01 20:20:12 crc kubenswrapper[4802]: I1201 20:20:12.881660 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73","Type":"ContainerDied","Data":"efb7abd5a3d6121fa75b809649c51c4cbfce30be0a3eefea3319993626982dcd"} Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.265155 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.431685 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgfj9\" (UniqueName: \"kubernetes.io/projected/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-kube-api-access-tgfj9\") pod \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.431961 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-combined-ca-bundle\") pod \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.432154 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-config-data\") pod \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\" (UID: \"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73\") " Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.438880 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-kube-api-access-tgfj9" (OuterVolumeSpecName: "kube-api-access-tgfj9") pod "4a49dde2-367a-4cd0-8bc3-60aea4c2dd73" (UID: "4a49dde2-367a-4cd0-8bc3-60aea4c2dd73"). InnerVolumeSpecName "kube-api-access-tgfj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.460713 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a49dde2-367a-4cd0-8bc3-60aea4c2dd73" (UID: "4a49dde2-367a-4cd0-8bc3-60aea4c2dd73"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.469857 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-config-data" (OuterVolumeSpecName: "config-data") pod "4a49dde2-367a-4cd0-8bc3-60aea4c2dd73" (UID: "4a49dde2-367a-4cd0-8bc3-60aea4c2dd73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.534542 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgfj9\" (UniqueName: \"kubernetes.io/projected/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-kube-api-access-tgfj9\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.534862 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.534875 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.894151 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.894338 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4a49dde2-367a-4cd0-8bc3-60aea4c2dd73","Type":"ContainerDied","Data":"44f151a28426228d7a60a28ab4d6f79b3fc6239f16837b31bc7f71a4848c90cf"} Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.894374 4802 scope.go:117] "RemoveContainer" containerID="efb7abd5a3d6121fa75b809649c51c4cbfce30be0a3eefea3319993626982dcd" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.946597 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.962839 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.973154 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 20:20:13 crc kubenswrapper[4802]: E1201 20:20:13.973672 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a49dde2-367a-4cd0-8bc3-60aea4c2dd73" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.973689 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a49dde2-367a-4cd0-8bc3-60aea4c2dd73" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.973885 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a49dde2-367a-4cd0-8bc3-60aea4c2dd73" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.974626 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.977087 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.977490 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.977729 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 01 20:20:13 crc kubenswrapper[4802]: I1201 20:20:13.987251 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.146097 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.146261 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.146349 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkg7f\" (UniqueName: \"kubernetes.io/projected/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-kube-api-access-pkg7f\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 
01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.146437 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.146490 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.248319 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.248419 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.248480 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkg7f\" (UniqueName: \"kubernetes.io/projected/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-kube-api-access-pkg7f\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc 
kubenswrapper[4802]: I1201 20:20:14.248534 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.248566 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.255960 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.255987 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.259307 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.260345 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.282808 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkg7f\" (UniqueName: \"kubernetes.io/projected/66cec21d-0c5f-4b29-8268-fb8f64d68bfb-kube-api-access-pkg7f\") pod \"nova-cell1-novncproxy-0\" (UID: \"66cec21d-0c5f-4b29-8268-fb8f64d68bfb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.310982 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.735639 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a49dde2-367a-4cd0-8bc3-60aea4c2dd73" path="/var/lib/kubelet/pods/4a49dde2-367a-4cd0-8bc3-60aea4c2dd73/volumes" Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.790089 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 20:20:14 crc kubenswrapper[4802]: I1201 20:20:14.902599 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66cec21d-0c5f-4b29-8268-fb8f64d68bfb","Type":"ContainerStarted","Data":"647fcf985630ac6b0816fcd6805086d1b830318bedb7ad032a6f39c1bfd07d2f"} Dec 01 20:20:15 crc kubenswrapper[4802]: I1201 20:20:15.462113 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 20:20:15 crc kubenswrapper[4802]: I1201 20:20:15.462874 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 20:20:15 crc kubenswrapper[4802]: I1201 20:20:15.465459 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Dec 01 20:20:15 crc kubenswrapper[4802]: I1201 20:20:15.468480 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 20:20:15 crc kubenswrapper[4802]: I1201 20:20:15.923573 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66cec21d-0c5f-4b29-8268-fb8f64d68bfb","Type":"ContainerStarted","Data":"68a06cdab4037a524309dd239c933f7faa83499dbb4169ac72dc0108634ed20a"} Dec 01 20:20:15 crc kubenswrapper[4802]: I1201 20:20:15.923639 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 20:20:15 crc kubenswrapper[4802]: I1201 20:20:15.930718 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 20:20:15 crc kubenswrapper[4802]: I1201 20:20:15.955797 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.955758883 podStartE2EDuration="2.955758883s" podCreationTimestamp="2025-12-01 20:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:20:15.943753561 +0000 UTC m=+1437.506313222" watchObservedRunningTime="2025-12-01 20:20:15.955758883 +0000 UTC m=+1437.518318534" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.156377 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-frqg6"] Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.158618 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.238124 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-frqg6"] Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.300383 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.300461 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.300567 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-dns-svc\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.300600 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-config\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.300647 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9q4s9\" (UniqueName: \"kubernetes.io/projected/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-kube-api-access-9q4s9\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.403889 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.403949 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.404033 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-dns-svc\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.404056 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-config\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.404101 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q4s9\" (UniqueName: 
\"kubernetes.io/projected/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-kube-api-access-9q4s9\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.404954 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.404957 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-dns-svc\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.405097 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.405124 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-config\") pod \"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.425392 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q4s9\" (UniqueName: \"kubernetes.io/projected/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-kube-api-access-9q4s9\") pod 
\"dnsmasq-dns-5b856c5697-frqg6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.504721 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:16 crc kubenswrapper[4802]: I1201 20:20:16.997748 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-frqg6"] Dec 01 20:20:17 crc kubenswrapper[4802]: W1201 20:20:17.003037 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebd2de8c_0e21_4d6c_9d1d_d74600eb8ff6.slice/crio-944e2985a3c1769855993a31a4bbfced240775d457af31a07bd21bac414dee57 WatchSource:0}: Error finding container 944e2985a3c1769855993a31a4bbfced240775d457af31a07bd21bac414dee57: Status 404 returned error can't find the container with id 944e2985a3c1769855993a31a4bbfced240775d457af31a07bd21bac414dee57 Dec 01 20:20:17 crc kubenswrapper[4802]: I1201 20:20:17.938563 4802 generic.go:334] "Generic (PLEG): container finished" podID="ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" containerID="791ffee4271476684fca7f33a2ce01946474fbd2f9c2d784fe9cacab5f50044c" exitCode=0 Dec 01 20:20:17 crc kubenswrapper[4802]: I1201 20:20:17.940395 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" event={"ID":"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6","Type":"ContainerDied","Data":"791ffee4271476684fca7f33a2ce01946474fbd2f9c2d784fe9cacab5f50044c"} Dec 01 20:20:17 crc kubenswrapper[4802]: I1201 20:20:17.940433 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" event={"ID":"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6","Type":"ContainerStarted","Data":"944e2985a3c1769855993a31a4bbfced240775d457af31a07bd21bac414dee57"} Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.399462 4802 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.399786 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="ceilometer-central-agent" containerID="cri-o://1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48" gracePeriod=30 Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.399932 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="proxy-httpd" containerID="cri-o://f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a" gracePeriod=30 Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.399988 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="sg-core" containerID="cri-o://ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb" gracePeriod=30 Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.400026 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="ceilometer-notification-agent" containerID="cri-o://67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de" gracePeriod=30 Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.407008 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.626667 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.954029 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" 
event={"ID":"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6","Type":"ContainerStarted","Data":"f353c04e5125e27586c04b29c7eb4c1d8d485e51ad407dedf67d73eb7d9d3b45"} Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.954285 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.964324 4802 generic.go:334] "Generic (PLEG): container finished" podID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerID="f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a" exitCode=0 Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.964364 4802 generic.go:334] "Generic (PLEG): container finished" podID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerID="ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb" exitCode=2 Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.964375 4802 generic.go:334] "Generic (PLEG): container finished" podID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerID="1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48" exitCode=0 Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.964590 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-log" containerID="cri-o://9f5ed18f2689f6b2c76edb1ea0f439d1d49747eba9c2c8b2a28a2fdebc608276" gracePeriod=30 Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.964883 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerDied","Data":"f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a"} Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.964922 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerDied","Data":"ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb"} Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.964938 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerDied","Data":"1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48"} Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.965001 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-api" containerID="cri-o://fbf442ffa23c05d13159af090a968d0f87022c799004fc540034f4be42509714" gracePeriod=30 Dec 01 20:20:18 crc kubenswrapper[4802]: I1201 20:20:18.979912 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" podStartSLOduration=2.979892117 podStartE2EDuration="2.979892117s" podCreationTimestamp="2025-12-01 20:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:20:18.97191683 +0000 UTC m=+1440.534476481" watchObservedRunningTime="2025-12-01 20:20:18.979892117 +0000 UTC m=+1440.542451758" Dec 01 20:20:19 crc kubenswrapper[4802]: I1201 20:20:19.311285 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:19 crc kubenswrapper[4802]: I1201 20:20:19.979506 4802 generic.go:334] "Generic (PLEG): container finished" podID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerID="9f5ed18f2689f6b2c76edb1ea0f439d1d49747eba9c2c8b2a28a2fdebc608276" exitCode=143 Dec 01 20:20:19 crc kubenswrapper[4802]: I1201 20:20:19.979585 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d9757fe0-f447-4ac9-8bb1-7576191b4418","Type":"ContainerDied","Data":"9f5ed18f2689f6b2c76edb1ea0f439d1d49747eba9c2c8b2a28a2fdebc608276"} Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.007782 4802 generic.go:334] "Generic (PLEG): container finished" podID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerID="fbf442ffa23c05d13159af090a968d0f87022c799004fc540034f4be42509714" exitCode=0 Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.007870 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9757fe0-f447-4ac9-8bb1-7576191b4418","Type":"ContainerDied","Data":"fbf442ffa23c05d13159af090a968d0f87022c799004fc540034f4be42509714"} Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.008254 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9757fe0-f447-4ac9-8bb1-7576191b4418","Type":"ContainerDied","Data":"49875f918427fcf1b4302ab41e44f2103aedae5d750c4de567d02c752b69048c"} Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.008269 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49875f918427fcf1b4302ab41e44f2103aedae5d750c4de567d02c752b69048c" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.076223 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.221557 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhzbf\" (UniqueName: \"kubernetes.io/projected/d9757fe0-f447-4ac9-8bb1-7576191b4418-kube-api-access-bhzbf\") pod \"d9757fe0-f447-4ac9-8bb1-7576191b4418\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.221866 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9757fe0-f447-4ac9-8bb1-7576191b4418-logs\") pod \"d9757fe0-f447-4ac9-8bb1-7576191b4418\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.222038 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-combined-ca-bundle\") pod \"d9757fe0-f447-4ac9-8bb1-7576191b4418\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.222084 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-config-data\") pod \"d9757fe0-f447-4ac9-8bb1-7576191b4418\" (UID: \"d9757fe0-f447-4ac9-8bb1-7576191b4418\") " Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.227379 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9757fe0-f447-4ac9-8bb1-7576191b4418-logs" (OuterVolumeSpecName: "logs") pod "d9757fe0-f447-4ac9-8bb1-7576191b4418" (UID: "d9757fe0-f447-4ac9-8bb1-7576191b4418"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.228026 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9757fe0-f447-4ac9-8bb1-7576191b4418-kube-api-access-bhzbf" (OuterVolumeSpecName: "kube-api-access-bhzbf") pod "d9757fe0-f447-4ac9-8bb1-7576191b4418" (UID: "d9757fe0-f447-4ac9-8bb1-7576191b4418"). InnerVolumeSpecName "kube-api-access-bhzbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.256215 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9757fe0-f447-4ac9-8bb1-7576191b4418" (UID: "d9757fe0-f447-4ac9-8bb1-7576191b4418"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.266823 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-config-data" (OuterVolumeSpecName: "config-data") pod "d9757fe0-f447-4ac9-8bb1-7576191b4418" (UID: "d9757fe0-f447-4ac9-8bb1-7576191b4418"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.326575 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.326622 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9757fe0-f447-4ac9-8bb1-7576191b4418-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.326640 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhzbf\" (UniqueName: \"kubernetes.io/projected/d9757fe0-f447-4ac9-8bb1-7576191b4418-kube-api-access-bhzbf\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.326657 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9757fe0-f447-4ac9-8bb1-7576191b4418-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:23 crc kubenswrapper[4802]: I1201 20:20:23.959634 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.018746 4802 generic.go:334] "Generic (PLEG): container finished" podID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerID="67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de" exitCode=0 Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.018814 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerDied","Data":"67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de"} Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.018844 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.018872 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a782c752-421f-4172-b3ef-5a2f6e772cd0","Type":"ContainerDied","Data":"8986a326c19cf3458d030d63d6d6dbb0efc04965184395a66e03b16441c73534"} Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.018900 4802 scope.go:117] "RemoveContainer" containerID="f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.018832 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.037188 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-config-data\") pod \"a782c752-421f-4172-b3ef-5a2f6e772cd0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.037322 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-sg-core-conf-yaml\") pod \"a782c752-421f-4172-b3ef-5a2f6e772cd0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.037349 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbftq\" (UniqueName: \"kubernetes.io/projected/a782c752-421f-4172-b3ef-5a2f6e772cd0-kube-api-access-vbftq\") pod \"a782c752-421f-4172-b3ef-5a2f6e772cd0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.037383 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-scripts\") pod \"a782c752-421f-4172-b3ef-5a2f6e772cd0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.037452 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-ceilometer-tls-certs\") pod \"a782c752-421f-4172-b3ef-5a2f6e772cd0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.037478 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-log-httpd\") pod \"a782c752-421f-4172-b3ef-5a2f6e772cd0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.037515 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-combined-ca-bundle\") pod \"a782c752-421f-4172-b3ef-5a2f6e772cd0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.037590 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-run-httpd\") pod \"a782c752-421f-4172-b3ef-5a2f6e772cd0\" (UID: \"a782c752-421f-4172-b3ef-5a2f6e772cd0\") " Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.037930 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a782c752-421f-4172-b3ef-5a2f6e772cd0" (UID: "a782c752-421f-4172-b3ef-5a2f6e772cd0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.038089 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a782c752-421f-4172-b3ef-5a2f6e772cd0" (UID: "a782c752-421f-4172-b3ef-5a2f6e772cd0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.038507 4802 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.038530 4802 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a782c752-421f-4172-b3ef-5a2f6e772cd0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.045373 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a782c752-421f-4172-b3ef-5a2f6e772cd0-kube-api-access-vbftq" (OuterVolumeSpecName: "kube-api-access-vbftq") pod "a782c752-421f-4172-b3ef-5a2f6e772cd0" (UID: "a782c752-421f-4172-b3ef-5a2f6e772cd0"). InnerVolumeSpecName "kube-api-access-vbftq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.046535 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-scripts" (OuterVolumeSpecName: "scripts") pod "a782c752-421f-4172-b3ef-5a2f6e772cd0" (UID: "a782c752-421f-4172-b3ef-5a2f6e772cd0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.050904 4802 scope.go:117] "RemoveContainer" containerID="ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.060414 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.070347 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.074950 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a782c752-421f-4172-b3ef-5a2f6e772cd0" (UID: "a782c752-421f-4172-b3ef-5a2f6e772cd0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.097250 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 20:20:24.097622 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-log" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.097634 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-log" Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 20:20:24.097647 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="sg-core" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.097653 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="sg-core" Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 20:20:24.097668 4802 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="proxy-httpd" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.097674 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="proxy-httpd" Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 20:20:24.097699 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="ceilometer-central-agent" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.097705 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="ceilometer-central-agent" Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 20:20:24.097719 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="ceilometer-notification-agent" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.097724 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="ceilometer-notification-agent" Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 20:20:24.097748 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-api" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.097754 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-api" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.097981 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-api" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.097992 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="sg-core" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.098002 4802 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="ceilometer-notification-agent" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.098013 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="proxy-httpd" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.098021 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" containerName="ceilometer-central-agent" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.098033 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" containerName="nova-api-log" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.099005 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.100147 4802 scope.go:117] "RemoveContainer" containerID="67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.102580 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.108854 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.115721 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.119952 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.130053 4802 scope.go:117] "RemoveContainer" containerID="1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.144713 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm7nq\" (UniqueName: \"kubernetes.io/projected/1fc2ed08-0642-4833-ae3b-1f717afa1244-kube-api-access-pm7nq\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.152349 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-public-tls-certs\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.152542 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.152599 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-config-data\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.152642 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc2ed08-0642-4833-ae3b-1f717afa1244-logs\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.152668 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.152740 4802 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.152752 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbftq\" (UniqueName: \"kubernetes.io/projected/a782c752-421f-4172-b3ef-5a2f6e772cd0-kube-api-access-vbftq\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.152779 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.190509 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a782c752-421f-4172-b3ef-5a2f6e772cd0" (UID: "a782c752-421f-4172-b3ef-5a2f6e772cd0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.191254 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a782c752-421f-4172-b3ef-5a2f6e772cd0" (UID: "a782c752-421f-4172-b3ef-5a2f6e772cd0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.206398 4802 scope.go:117] "RemoveContainer" containerID="f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a" Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 20:20:24.212323 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a\": container with ID starting with f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a not found: ID does not exist" containerID="f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.212389 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a"} err="failed to get container status \"f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a\": rpc error: code = NotFound desc = could not find container \"f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a\": container with ID starting with f67121a083a2a59cd5d0ed20d392e6fdca4e22d6251ccd2e2935c678389cc25a not found: ID does not exist" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.212413 4802 scope.go:117] "RemoveContainer" containerID="ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb" Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 20:20:24.222376 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb\": container with ID starting with ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb not found: ID does not exist" containerID="ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.222446 
4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb"} err="failed to get container status \"ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb\": rpc error: code = NotFound desc = could not find container \"ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb\": container with ID starting with ba6eb83945068564450ea89f9e20c4251a23a8ccf30ad37343b4bce181160dfb not found: ID does not exist" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.222471 4802 scope.go:117] "RemoveContainer" containerID="67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de" Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 20:20:24.229325 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de\": container with ID starting with 67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de not found: ID does not exist" containerID="67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.229380 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de"} err="failed to get container status \"67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de\": rpc error: code = NotFound desc = could not find container \"67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de\": container with ID starting with 67c0d211c4ddfbac73101ba431d03e308fcea674191b50977d1a2ba6726543de not found: ID does not exist" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.229403 4802 scope.go:117] "RemoveContainer" containerID="1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48" Dec 01 20:20:24 crc kubenswrapper[4802]: E1201 
20:20:24.238362 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48\": container with ID starting with 1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48 not found: ID does not exist" containerID="1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.238412 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48"} err="failed to get container status \"1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48\": rpc error: code = NotFound desc = could not find container \"1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48\": container with ID starting with 1926ad08a845c2a56803b986f9babbd8b6644d06a519e8b9335174ec14f2dd48 not found: ID does not exist" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.257397 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-public-tls-certs\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.257499 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.257546 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-config-data\") pod \"nova-api-0\" (UID: 
\"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.257567 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc2ed08-0642-4833-ae3b-1f717afa1244-logs\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.257587 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.257613 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm7nq\" (UniqueName: \"kubernetes.io/projected/1fc2ed08-0642-4833-ae3b-1f717afa1244-kube-api-access-pm7nq\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.257706 4802 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.257718 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.262579 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc2ed08-0642-4833-ae3b-1f717afa1244-logs\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " 
pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.270174 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-public-tls-certs\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.284622 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-config-data\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.287306 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.287815 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.292815 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm7nq\" (UniqueName: \"kubernetes.io/projected/1fc2ed08-0642-4833-ae3b-1f717afa1244-kube-api-access-pm7nq\") pod \"nova-api-0\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.305682 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-config-data" 
(OuterVolumeSpecName: "config-data") pod "a782c752-421f-4172-b3ef-5a2f6e772cd0" (UID: "a782c752-421f-4172-b3ef-5a2f6e772cd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.311088 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.332136 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.362061 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a782c752-421f-4172-b3ef-5a2f6e772cd0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.410988 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.430817 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.432813 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.441650 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.444091 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.449388 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.449842 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.449879 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.450077 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.571680 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.572085 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-log-httpd\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.572183 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-scripts\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.572324 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7dcv\" 
(UniqueName: \"kubernetes.io/projected/8d1dda46-a58e-449a-b955-fe29d42e657b-kube-api-access-j7dcv\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.572368 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-run-httpd\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.572955 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.573036 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-config-data\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.573053 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.674496 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-config-data\") pod \"ceilometer-0\" (UID: 
\"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.674541 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.674568 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.674644 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-log-httpd\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.674693 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-scripts\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.674722 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7dcv\" (UniqueName: \"kubernetes.io/projected/8d1dda46-a58e-449a-b955-fe29d42e657b-kube-api-access-j7dcv\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.674743 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-run-httpd\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.674762 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.675896 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-log-httpd\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.677237 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-run-httpd\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.681082 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-scripts\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.681320 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.681792 
4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.689642 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.691713 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7dcv\" (UniqueName: \"kubernetes.io/projected/8d1dda46-a58e-449a-b955-fe29d42e657b-kube-api-access-j7dcv\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.693762 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-config-data\") pod \"ceilometer-0\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") " pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.730250 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a782c752-421f-4172-b3ef-5a2f6e772cd0" path="/var/lib/kubelet/pods/a782c752-421f-4172-b3ef-5a2f6e772cd0/volumes" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.731016 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9757fe0-f447-4ac9-8bb1-7576191b4418" path="/var/lib/kubelet/pods/d9757fe0-f447-4ac9-8bb1-7576191b4418/volumes" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.850430 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 20:20:24 crc kubenswrapper[4802]: I1201 20:20:24.914802 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.031602 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fc2ed08-0642-4833-ae3b-1f717afa1244","Type":"ContainerStarted","Data":"175232f7b661682157fd7201ec4af2059c339bc2a7979490adb48be580f4d7f0"} Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.053436 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.272807 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wb4qn"] Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.274262 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.276123 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.276346 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.296680 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wb4qn"] Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.331727 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:20:25 crc kubenswrapper[4802]: W1201 20:20:25.337261 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d1dda46_a58e_449a_b955_fe29d42e657b.slice/crio-9514e36a9f561c1a6ad518e1c5a59f6ced9540c3e26c2c6e299f2a83b0aa05be 
WatchSource:0}: Error finding container 9514e36a9f561c1a6ad518e1c5a59f6ced9540c3e26c2c6e299f2a83b0aa05be: Status 404 returned error can't find the container with id 9514e36a9f561c1a6ad518e1c5a59f6ced9540c3e26c2c6e299f2a83b0aa05be Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.388811 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-config-data\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.388880 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-scripts\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.389005 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.389087 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hktmc\" (UniqueName: \"kubernetes.io/projected/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-kube-api-access-hktmc\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.490586 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.490724 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hktmc\" (UniqueName: \"kubernetes.io/projected/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-kube-api-access-hktmc\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.490875 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-config-data\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.490915 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-scripts\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.495473 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-config-data\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.497165 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-scripts\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.503525 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.509030 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hktmc\" (UniqueName: \"kubernetes.io/projected/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-kube-api-access-hktmc\") pod \"nova-cell1-cell-mapping-wb4qn\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:25 crc kubenswrapper[4802]: I1201 20:20:25.632152 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:26 crc kubenswrapper[4802]: I1201 20:20:26.043401 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fc2ed08-0642-4833-ae3b-1f717afa1244","Type":"ContainerStarted","Data":"e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff"} Dec 01 20:20:26 crc kubenswrapper[4802]: I1201 20:20:26.044659 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fc2ed08-0642-4833-ae3b-1f717afa1244","Type":"ContainerStarted","Data":"bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99"} Dec 01 20:20:26 crc kubenswrapper[4802]: I1201 20:20:26.046378 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerStarted","Data":"eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d"} Dec 01 20:20:26 crc kubenswrapper[4802]: I1201 20:20:26.046493 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerStarted","Data":"9514e36a9f561c1a6ad518e1c5a59f6ced9540c3e26c2c6e299f2a83b0aa05be"} Dec 01 20:20:26 crc kubenswrapper[4802]: I1201 20:20:26.068443 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.068426111 podStartE2EDuration="2.068426111s" podCreationTimestamp="2025-12-01 20:20:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:20:26.062486918 +0000 UTC m=+1447.625046559" watchObservedRunningTime="2025-12-01 20:20:26.068426111 +0000 UTC m=+1447.630985742" Dec 01 20:20:26 crc kubenswrapper[4802]: W1201 20:20:26.103936 4802 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda77beefc_0880_4f76_b2d4_4b3b1b3c42e9.slice/crio-e417e115b665a7f7767cb2a2f5f712c40500da65bebbaddabb0cd33d369726fe WatchSource:0}: Error finding container e417e115b665a7f7767cb2a2f5f712c40500da65bebbaddabb0cd33d369726fe: Status 404 returned error can't find the container with id e417e115b665a7f7767cb2a2f5f712c40500da65bebbaddabb0cd33d369726fe Dec 01 20:20:26 crc kubenswrapper[4802]: I1201 20:20:26.104438 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wb4qn"] Dec 01 20:20:26 crc kubenswrapper[4802]: I1201 20:20:26.508085 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:20:26 crc kubenswrapper[4802]: I1201 20:20:26.580700 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-jptns"] Dec 01 20:20:26 crc kubenswrapper[4802]: I1201 20:20:26.581152 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-jptns" podUID="3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" containerName="dnsmasq-dns" containerID="cri-o://3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e" gracePeriod=10 Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.097747 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.125519 4802 generic.go:334] "Generic (PLEG): container finished" podID="3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" containerID="3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e" exitCode=0 Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.125654 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-jptns" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.126178 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-jptns" event={"ID":"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77","Type":"ContainerDied","Data":"3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e"} Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.126224 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-jptns" event={"ID":"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77","Type":"ContainerDied","Data":"1fa123b3ed1ec5796dc2abf9c397e80f677a628e5c480069188c9809b3fe4d74"} Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.126240 4802 scope.go:117] "RemoveContainer" containerID="3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.134539 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wb4qn" event={"ID":"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9","Type":"ContainerStarted","Data":"262fe17bf5ba9453ade40441101279313abb7f5a6d1f8c92df72c8ab42e97d8c"} Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.134583 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wb4qn" event={"ID":"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9","Type":"ContainerStarted","Data":"e417e115b665a7f7767cb2a2f5f712c40500da65bebbaddabb0cd33d369726fe"} Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.142530 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerStarted","Data":"5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb"} Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.156413 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wb4qn" 
podStartSLOduration=2.156388554 podStartE2EDuration="2.156388554s" podCreationTimestamp="2025-12-01 20:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:20:27.152627317 +0000 UTC m=+1448.715186978" watchObservedRunningTime="2025-12-01 20:20:27.156388554 +0000 UTC m=+1448.718948205" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.157523 4802 scope.go:117] "RemoveContainer" containerID="226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.181154 4802 scope.go:117] "RemoveContainer" containerID="3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e" Dec 01 20:20:27 crc kubenswrapper[4802]: E1201 20:20:27.181700 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e\": container with ID starting with 3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e not found: ID does not exist" containerID="3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.181829 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e"} err="failed to get container status \"3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e\": rpc error: code = NotFound desc = could not find container \"3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e\": container with ID starting with 3880fcb5137c8753165a638efde9a0ef144b59ee00f80d5b9745fe8731b8f34e not found: ID does not exist" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.181934 4802 scope.go:117] "RemoveContainer" containerID="226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2" Dec 01 20:20:27 
crc kubenswrapper[4802]: E1201 20:20:27.182293 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2\": container with ID starting with 226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2 not found: ID does not exist" containerID="226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.182385 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2"} err="failed to get container status \"226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2\": rpc error: code = NotFound desc = could not find container \"226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2\": container with ID starting with 226fdb1c67d12dfdad0ec2abbe87cc5361b24598a0f081753aede42b382fd5a2 not found: ID does not exist" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.223087 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-config\") pod \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.223221 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-sb\") pod \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.223289 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-dns-svc\") pod 
\"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.223310 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llz7v\" (UniqueName: \"kubernetes.io/projected/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-kube-api-access-llz7v\") pod \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.223343 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-nb\") pod \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\" (UID: \"3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77\") " Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.249742 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-kube-api-access-llz7v" (OuterVolumeSpecName: "kube-api-access-llz7v") pod "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" (UID: "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77"). InnerVolumeSpecName "kube-api-access-llz7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.281149 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" (UID: "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.287375 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-config" (OuterVolumeSpecName: "config") pod "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" (UID: "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.294163 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" (UID: "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.307811 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" (UID: "3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.326212 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.326481 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.326578 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llz7v\" (UniqueName: \"kubernetes.io/projected/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-kube-api-access-llz7v\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.326670 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.326763 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.461086 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-jptns"] Dec 01 20:20:27 crc kubenswrapper[4802]: I1201 20:20:27.468831 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-jptns"] Dec 01 20:20:28 crc kubenswrapper[4802]: I1201 20:20:28.731830 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" path="/var/lib/kubelet/pods/3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77/volumes" Dec 01 20:20:29 crc kubenswrapper[4802]: 
I1201 20:20:29.176635 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerStarted","Data":"5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869"} Dec 01 20:20:31 crc kubenswrapper[4802]: I1201 20:20:31.199865 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerStarted","Data":"9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f"} Dec 01 20:20:31 crc kubenswrapper[4802]: I1201 20:20:31.200179 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 20:20:31 crc kubenswrapper[4802]: I1201 20:20:31.230463 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7635104510000001 podStartE2EDuration="7.230446592s" podCreationTimestamp="2025-12-01 20:20:24 +0000 UTC" firstStartedPulling="2025-12-01 20:20:25.339696924 +0000 UTC m=+1446.902256565" lastFinishedPulling="2025-12-01 20:20:30.806633075 +0000 UTC m=+1452.369192706" observedRunningTime="2025-12-01 20:20:31.225008013 +0000 UTC m=+1452.787567654" watchObservedRunningTime="2025-12-01 20:20:31.230446592 +0000 UTC m=+1452.793006233" Dec 01 20:20:32 crc kubenswrapper[4802]: I1201 20:20:32.211221 4802 generic.go:334] "Generic (PLEG): container finished" podID="a77beefc-0880-4f76-b2d4-4b3b1b3c42e9" containerID="262fe17bf5ba9453ade40441101279313abb7f5a6d1f8c92df72c8ab42e97d8c" exitCode=0 Dec 01 20:20:32 crc kubenswrapper[4802]: I1201 20:20:32.211301 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wb4qn" event={"ID":"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9","Type":"ContainerDied","Data":"262fe17bf5ba9453ade40441101279313abb7f5a6d1f8c92df72c8ab42e97d8c"} Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.579461 4802 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.653738 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-config-data\") pod \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.653847 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-combined-ca-bundle\") pod \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.653977 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hktmc\" (UniqueName: \"kubernetes.io/projected/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-kube-api-access-hktmc\") pod \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.654040 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-scripts\") pod \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\" (UID: \"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9\") " Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.663007 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-scripts" (OuterVolumeSpecName: "scripts") pod "a77beefc-0880-4f76-b2d4-4b3b1b3c42e9" (UID: "a77beefc-0880-4f76-b2d4-4b3b1b3c42e9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.663042 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-kube-api-access-hktmc" (OuterVolumeSpecName: "kube-api-access-hktmc") pod "a77beefc-0880-4f76-b2d4-4b3b1b3c42e9" (UID: "a77beefc-0880-4f76-b2d4-4b3b1b3c42e9"). InnerVolumeSpecName "kube-api-access-hktmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.693386 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a77beefc-0880-4f76-b2d4-4b3b1b3c42e9" (UID: "a77beefc-0880-4f76-b2d4-4b3b1b3c42e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.694457 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-config-data" (OuterVolumeSpecName: "config-data") pod "a77beefc-0880-4f76-b2d4-4b3b1b3c42e9" (UID: "a77beefc-0880-4f76-b2d4-4b3b1b3c42e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.758048 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.758080 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.758092 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hktmc\" (UniqueName: \"kubernetes.io/projected/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-kube-api-access-hktmc\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:33 crc kubenswrapper[4802]: I1201 20:20:33.758101 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.230608 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wb4qn" event={"ID":"a77beefc-0880-4f76-b2d4-4b3b1b3c42e9","Type":"ContainerDied","Data":"e417e115b665a7f7767cb2a2f5f712c40500da65bebbaddabb0cd33d369726fe"} Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.230951 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e417e115b665a7f7767cb2a2f5f712c40500da65bebbaddabb0cd33d369726fe" Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.230629 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wb4qn" Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.422547 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.422837 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="30203d04-83ed-42d6-9f9e-e7db08604e70" containerName="nova-scheduler-scheduler" containerID="cri-o://577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30" gracePeriod=30 Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.434274 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.434359 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 20:20:34 crc kubenswrapper[4802]: E1201 20:20:34.441455 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 20:20:34 crc kubenswrapper[4802]: E1201 20:20:34.445032 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 20:20:34 crc kubenswrapper[4802]: E1201 20:20:34.447364 4802 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 20:20:34 crc kubenswrapper[4802]: E1201 20:20:34.447429 4802 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="30203d04-83ed-42d6-9f9e-e7db08604e70" containerName="nova-scheduler-scheduler" Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.458259 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.468884 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.469170 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-log" containerID="cri-o://f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34" gracePeriod=30 Dec 01 20:20:34 crc kubenswrapper[4802]: I1201 20:20:34.469375 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-metadata" containerID="cri-o://9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4" gracePeriod=30 Dec 01 20:20:35 crc kubenswrapper[4802]: I1201 20:20:35.240289 4802 generic.go:334] "Generic (PLEG): container finished" podID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerID="f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34" exitCode=143 Dec 01 20:20:35 crc kubenswrapper[4802]: I1201 20:20:35.240702 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" 
containerName="nova-api-log" containerID="cri-o://bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99" gracePeriod=30 Dec 01 20:20:35 crc kubenswrapper[4802]: I1201 20:20:35.240377 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6","Type":"ContainerDied","Data":"f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34"} Dec 01 20:20:35 crc kubenswrapper[4802]: I1201 20:20:35.241063 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerName="nova-api-api" containerID="cri-o://e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff" gracePeriod=30 Dec 01 20:20:35 crc kubenswrapper[4802]: I1201 20:20:35.244884 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": EOF" Dec 01 20:20:35 crc kubenswrapper[4802]: I1201 20:20:35.244884 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": EOF" Dec 01 20:20:36 crc kubenswrapper[4802]: I1201 20:20:36.255441 4802 generic.go:334] "Generic (PLEG): container finished" podID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerID="bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99" exitCode=143 Dec 01 20:20:36 crc kubenswrapper[4802]: I1201 20:20:36.255497 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fc2ed08-0642-4833-ae3b-1f717afa1244","Type":"ContainerDied","Data":"bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99"} Dec 01 20:20:37 crc kubenswrapper[4802]: I1201 20:20:37.597574 4802 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.175:8775/\": read tcp 10.217.0.2:55754->10.217.0.175:8775: read: connection reset by peer" Dec 01 20:20:37 crc kubenswrapper[4802]: I1201 20:20:37.597619 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.175:8775/\": read tcp 10.217.0.2:55760->10.217.0.175:8775: read: connection reset by peer" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.118160 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.255511 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-combined-ca-bundle\") pod \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.255582 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-config-data\") pod \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.255632 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-nova-metadata-tls-certs\") pod \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.255660 4802 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdvbs\" (UniqueName: \"kubernetes.io/projected/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-kube-api-access-pdvbs\") pod \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.255703 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-logs\") pod \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\" (UID: \"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6\") " Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.257230 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-logs" (OuterVolumeSpecName: "logs") pod "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" (UID: "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.268585 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-kube-api-access-pdvbs" (OuterVolumeSpecName: "kube-api-access-pdvbs") pod "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" (UID: "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6"). InnerVolumeSpecName "kube-api-access-pdvbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.342176 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-config-data" (OuterVolumeSpecName: "config-data") pod "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" (UID: "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.347067 4802 generic.go:334] "Generic (PLEG): container finished" podID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerID="9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4" exitCode=0 Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.347125 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6","Type":"ContainerDied","Data":"9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4"} Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.347157 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.347188 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f6e2953-0b0e-4e75-b450-03e3faf4bdf6","Type":"ContainerDied","Data":"12690994fd94f6f3c419a26a552cbb8437cd674211387ff5519122207513ce73"} Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.347225 4802 scope.go:117] "RemoveContainer" containerID="9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.349346 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" (UID: "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.357604 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.357638 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.357648 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdvbs\" (UniqueName: \"kubernetes.io/projected/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-kube-api-access-pdvbs\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.357659 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.370434 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" (UID: "2f6e2953-0b0e-4e75-b450-03e3faf4bdf6"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.426613 4802 scope.go:117] "RemoveContainer" containerID="f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.444156 4802 scope.go:117] "RemoveContainer" containerID="9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4" Dec 01 20:20:38 crc kubenswrapper[4802]: E1201 20:20:38.444708 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4\": container with ID starting with 9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4 not found: ID does not exist" containerID="9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.444747 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4"} err="failed to get container status \"9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4\": rpc error: code = NotFound desc = could not find container \"9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4\": container with ID starting with 9f274b5d6b00a397f160480aac98e75341f613bd21121492dad893dfae438cd4 not found: ID does not exist" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.444771 4802 scope.go:117] "RemoveContainer" containerID="f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34" Dec 01 20:20:38 crc kubenswrapper[4802]: E1201 20:20:38.445142 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34\": container with ID starting with 
f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34 not found: ID does not exist" containerID="f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.445225 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34"} err="failed to get container status \"f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34\": rpc error: code = NotFound desc = could not find container \"f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34\": container with ID starting with f02630047b4c7ca6c9c3803a4106e275e1745e409bb993ce1a1fcb2ebebc2f34 not found: ID does not exist" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.459424 4802 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.695223 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.709846 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.717936 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:20:38 crc kubenswrapper[4802]: E1201 20:20:38.719979 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" containerName="init" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.719995 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" containerName="init" Dec 01 20:20:38 crc kubenswrapper[4802]: E1201 20:20:38.720012 4802 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" containerName="dnsmasq-dns" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.720018 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" containerName="dnsmasq-dns" Dec 01 20:20:38 crc kubenswrapper[4802]: E1201 20:20:38.720042 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-log" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.720050 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-log" Dec 01 20:20:38 crc kubenswrapper[4802]: E1201 20:20:38.720066 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77beefc-0880-4f76-b2d4-4b3b1b3c42e9" containerName="nova-manage" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.720072 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77beefc-0880-4f76-b2d4-4b3b1b3c42e9" containerName="nova-manage" Dec 01 20:20:38 crc kubenswrapper[4802]: E1201 20:20:38.720085 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-metadata" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.720090 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-metadata" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.720261 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-log" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.720280 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a77beefc-0880-4f76-b2d4-4b3b1b3c42e9" containerName="nova-manage" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.720287 4802 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" containerName="nova-metadata-metadata" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.720299 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4d4c6a-d6b8-4f83-b7ea-47757fc32c77" containerName="dnsmasq-dns" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.725625 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.746130 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.746438 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.750060 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f6e2953-0b0e-4e75-b450-03e3faf4bdf6" path="/var/lib/kubelet/pods/2f6e2953-0b0e-4e75-b450-03e3faf4bdf6/volumes" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.764820 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a605ed-0bb9-4c8d-9a6b-86643ff44518-config-data\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.764889 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhsmq\" (UniqueName: \"kubernetes.io/projected/77a605ed-0bb9-4c8d-9a6b-86643ff44518-kube-api-access-vhsmq\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.764932 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/77a605ed-0bb9-4c8d-9a6b-86643ff44518-logs\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.764972 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77a605ed-0bb9-4c8d-9a6b-86643ff44518-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.765072 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77a605ed-0bb9-4c8d-9a6b-86643ff44518-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.778525 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.867386 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77a605ed-0bb9-4c8d-9a6b-86643ff44518-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.867514 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a605ed-0bb9-4c8d-9a6b-86643ff44518-config-data\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.867587 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vhsmq\" (UniqueName: \"kubernetes.io/projected/77a605ed-0bb9-4c8d-9a6b-86643ff44518-kube-api-access-vhsmq\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.867650 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77a605ed-0bb9-4c8d-9a6b-86643ff44518-logs\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.867708 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77a605ed-0bb9-4c8d-9a6b-86643ff44518-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.868141 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77a605ed-0bb9-4c8d-9a6b-86643ff44518-logs\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.872883 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77a605ed-0bb9-4c8d-9a6b-86643ff44518-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.873090 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/77a605ed-0bb9-4c8d-9a6b-86643ff44518-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " 
pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.874226 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a605ed-0bb9-4c8d-9a6b-86643ff44518-config-data\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.884207 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhsmq\" (UniqueName: \"kubernetes.io/projected/77a605ed-0bb9-4c8d-9a6b-86643ff44518-kube-api-access-vhsmq\") pod \"nova-metadata-0\" (UID: \"77a605ed-0bb9-4c8d-9a6b-86643ff44518\") " pod="openstack/nova-metadata-0" Dec 01 20:20:38 crc kubenswrapper[4802]: I1201 20:20:38.957023 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.070932 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b7rw\" (UniqueName: \"kubernetes.io/projected/30203d04-83ed-42d6-9f9e-e7db08604e70-kube-api-access-5b7rw\") pod \"30203d04-83ed-42d6-9f9e-e7db08604e70\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.071162 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-config-data\") pod \"30203d04-83ed-42d6-9f9e-e7db08604e70\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.071443 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-combined-ca-bundle\") pod \"30203d04-83ed-42d6-9f9e-e7db08604e70\" (UID: \"30203d04-83ed-42d6-9f9e-e7db08604e70\") " 
Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.072804 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.077452 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30203d04-83ed-42d6-9f9e-e7db08604e70-kube-api-access-5b7rw" (OuterVolumeSpecName: "kube-api-access-5b7rw") pod "30203d04-83ed-42d6-9f9e-e7db08604e70" (UID: "30203d04-83ed-42d6-9f9e-e7db08604e70"). InnerVolumeSpecName "kube-api-access-5b7rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.106365 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-config-data" (OuterVolumeSpecName: "config-data") pod "30203d04-83ed-42d6-9f9e-e7db08604e70" (UID: "30203d04-83ed-42d6-9f9e-e7db08604e70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.113623 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30203d04-83ed-42d6-9f9e-e7db08604e70" (UID: "30203d04-83ed-42d6-9f9e-e7db08604e70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.176013 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.176048 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b7rw\" (UniqueName: \"kubernetes.io/projected/30203d04-83ed-42d6-9f9e-e7db08604e70-kube-api-access-5b7rw\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.176063 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30203d04-83ed-42d6-9f9e-e7db08604e70-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.362172 4802 generic.go:334] "Generic (PLEG): container finished" podID="30203d04-83ed-42d6-9f9e-e7db08604e70" containerID="577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30" exitCode=0 Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.362382 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30203d04-83ed-42d6-9f9e-e7db08604e70","Type":"ContainerDied","Data":"577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30"} Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.362456 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.362466 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30203d04-83ed-42d6-9f9e-e7db08604e70","Type":"ContainerDied","Data":"6c8b5fd32d55e532f771586a91679b1198120b2cdaa01798a2f95bfd34fd9c1e"} Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.362489 4802 scope.go:117] "RemoveContainer" containerID="577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.404605 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.405532 4802 scope.go:117] "RemoveContainer" containerID="577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30" Dec 01 20:20:39 crc kubenswrapper[4802]: E1201 20:20:39.406503 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30\": container with ID starting with 577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30 not found: ID does not exist" containerID="577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.406554 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30"} err="failed to get container status \"577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30\": rpc error: code = NotFound desc = could not find container \"577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30\": container with ID starting with 577cff729acb0bd11fab8296589ff75931b69ace57247e4afb9b34552232dc30 not found: ID does not exist" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.423409 4802 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.440669 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:20:39 crc kubenswrapper[4802]: E1201 20:20:39.441372 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30203d04-83ed-42d6-9f9e-e7db08604e70" containerName="nova-scheduler-scheduler" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.441396 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="30203d04-83ed-42d6-9f9e-e7db08604e70" containerName="nova-scheduler-scheduler" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.441614 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="30203d04-83ed-42d6-9f9e-e7db08604e70" containerName="nova-scheduler-scheduler" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.442445 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.448251 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.449855 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.484370 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eedf56f-d2da-4526-94fc-346c826a891d-config-data\") pod \"nova-scheduler-0\" (UID: \"5eedf56f-d2da-4526-94fc-346c826a891d\") " pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.484439 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hst6d\" (UniqueName: 
\"kubernetes.io/projected/5eedf56f-d2da-4526-94fc-346c826a891d-kube-api-access-hst6d\") pod \"nova-scheduler-0\" (UID: \"5eedf56f-d2da-4526-94fc-346c826a891d\") " pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.484526 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eedf56f-d2da-4526-94fc-346c826a891d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eedf56f-d2da-4526-94fc-346c826a891d\") " pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.555817 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 20:20:39 crc kubenswrapper[4802]: W1201 20:20:39.561627 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77a605ed_0bb9_4c8d_9a6b_86643ff44518.slice/crio-07314f7994a7cc79f4815a6050a468bfa21653fde753bc856f91c0dbd95f953d WatchSource:0}: Error finding container 07314f7994a7cc79f4815a6050a468bfa21653fde753bc856f91c0dbd95f953d: Status 404 returned error can't find the container with id 07314f7994a7cc79f4815a6050a468bfa21653fde753bc856f91c0dbd95f953d Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.589385 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eedf56f-d2da-4526-94fc-346c826a891d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eedf56f-d2da-4526-94fc-346c826a891d\") " pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.589528 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eedf56f-d2da-4526-94fc-346c826a891d-config-data\") pod \"nova-scheduler-0\" (UID: \"5eedf56f-d2da-4526-94fc-346c826a891d\") " 
pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.589561 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hst6d\" (UniqueName: \"kubernetes.io/projected/5eedf56f-d2da-4526-94fc-346c826a891d-kube-api-access-hst6d\") pod \"nova-scheduler-0\" (UID: \"5eedf56f-d2da-4526-94fc-346c826a891d\") " pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.594502 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eedf56f-d2da-4526-94fc-346c826a891d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eedf56f-d2da-4526-94fc-346c826a891d\") " pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.597952 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eedf56f-d2da-4526-94fc-346c826a891d-config-data\") pod \"nova-scheduler-0\" (UID: \"5eedf56f-d2da-4526-94fc-346c826a891d\") " pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.607014 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hst6d\" (UniqueName: \"kubernetes.io/projected/5eedf56f-d2da-4526-94fc-346c826a891d-kube-api-access-hst6d\") pod \"nova-scheduler-0\" (UID: \"5eedf56f-d2da-4526-94fc-346c826a891d\") " pod="openstack/nova-scheduler-0" Dec 01 20:20:39 crc kubenswrapper[4802]: I1201 20:20:39.765014 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 20:20:40 crc kubenswrapper[4802]: W1201 20:20:40.229039 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eedf56f_d2da_4526_94fc_346c826a891d.slice/crio-c46769c75520f771917d4ec7db3b10d287236d7bcbdf445e72b2a750749f0b4e WatchSource:0}: Error finding container c46769c75520f771917d4ec7db3b10d287236d7bcbdf445e72b2a750749f0b4e: Status 404 returned error can't find the container with id c46769c75520f771917d4ec7db3b10d287236d7bcbdf445e72b2a750749f0b4e Dec 01 20:20:40 crc kubenswrapper[4802]: I1201 20:20:40.231037 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 20:20:40 crc kubenswrapper[4802]: I1201 20:20:40.371703 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eedf56f-d2da-4526-94fc-346c826a891d","Type":"ContainerStarted","Data":"c46769c75520f771917d4ec7db3b10d287236d7bcbdf445e72b2a750749f0b4e"} Dec 01 20:20:40 crc kubenswrapper[4802]: I1201 20:20:40.377042 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77a605ed-0bb9-4c8d-9a6b-86643ff44518","Type":"ContainerStarted","Data":"89d5a411d193575264091b8bcfb13e3be25a49c30be561e6cb28c2617bceb392"} Dec 01 20:20:40 crc kubenswrapper[4802]: I1201 20:20:40.377110 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77a605ed-0bb9-4c8d-9a6b-86643ff44518","Type":"ContainerStarted","Data":"7f21101f2bacc173b8b9f2f9020db40a73153e9ef5ba8d510dfe411be0c3da21"} Dec 01 20:20:40 crc kubenswrapper[4802]: I1201 20:20:40.377127 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"77a605ed-0bb9-4c8d-9a6b-86643ff44518","Type":"ContainerStarted","Data":"07314f7994a7cc79f4815a6050a468bfa21653fde753bc856f91c0dbd95f953d"} Dec 01 20:20:40 crc 
kubenswrapper[4802]: I1201 20:20:40.402569 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.402551916 podStartE2EDuration="2.402551916s" podCreationTimestamp="2025-12-01 20:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:20:40.396245221 +0000 UTC m=+1461.958804882" watchObservedRunningTime="2025-12-01 20:20:40.402551916 +0000 UTC m=+1461.965111557" Dec 01 20:20:40 crc kubenswrapper[4802]: I1201 20:20:40.736901 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30203d04-83ed-42d6-9f9e-e7db08604e70" path="/var/lib/kubelet/pods/30203d04-83ed-42d6-9f9e-e7db08604e70/volumes" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.120581 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.218777 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-internal-tls-certs\") pod \"1fc2ed08-0642-4833-ae3b-1f717afa1244\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.218853 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-public-tls-certs\") pod \"1fc2ed08-0642-4833-ae3b-1f717afa1244\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.218889 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-config-data\") pod \"1fc2ed08-0642-4833-ae3b-1f717afa1244\" (UID: 
\"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.218963 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc2ed08-0642-4833-ae3b-1f717afa1244-logs\") pod \"1fc2ed08-0642-4833-ae3b-1f717afa1244\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.219360 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm7nq\" (UniqueName: \"kubernetes.io/projected/1fc2ed08-0642-4833-ae3b-1f717afa1244-kube-api-access-pm7nq\") pod \"1fc2ed08-0642-4833-ae3b-1f717afa1244\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.219473 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-combined-ca-bundle\") pod \"1fc2ed08-0642-4833-ae3b-1f717afa1244\" (UID: \"1fc2ed08-0642-4833-ae3b-1f717afa1244\") " Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.219579 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc2ed08-0642-4833-ae3b-1f717afa1244-logs" (OuterVolumeSpecName: "logs") pod "1fc2ed08-0642-4833-ae3b-1f717afa1244" (UID: "1fc2ed08-0642-4833-ae3b-1f717afa1244"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.220505 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fc2ed08-0642-4833-ae3b-1f717afa1244-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.224675 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc2ed08-0642-4833-ae3b-1f717afa1244-kube-api-access-pm7nq" (OuterVolumeSpecName: "kube-api-access-pm7nq") pod "1fc2ed08-0642-4833-ae3b-1f717afa1244" (UID: "1fc2ed08-0642-4833-ae3b-1f717afa1244"). InnerVolumeSpecName "kube-api-access-pm7nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.244825 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-config-data" (OuterVolumeSpecName: "config-data") pod "1fc2ed08-0642-4833-ae3b-1f717afa1244" (UID: "1fc2ed08-0642-4833-ae3b-1f717afa1244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.248070 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fc2ed08-0642-4833-ae3b-1f717afa1244" (UID: "1fc2ed08-0642-4833-ae3b-1f717afa1244"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.267240 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1fc2ed08-0642-4833-ae3b-1f717afa1244" (UID: "1fc2ed08-0642-4833-ae3b-1f717afa1244"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.276497 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1fc2ed08-0642-4833-ae3b-1f717afa1244" (UID: "1fc2ed08-0642-4833-ae3b-1f717afa1244"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.322875 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.322918 4802 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.322930 4802 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.322939 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fc2ed08-0642-4833-ae3b-1f717afa1244-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.322951 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm7nq\" (UniqueName: \"kubernetes.io/projected/1fc2ed08-0642-4833-ae3b-1f717afa1244-kube-api-access-pm7nq\") on node \"crc\" DevicePath \"\"" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.385845 4802 generic.go:334] "Generic (PLEG): 
container finished" podID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerID="e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff" exitCode=0 Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.385963 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.385970 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fc2ed08-0642-4833-ae3b-1f717afa1244","Type":"ContainerDied","Data":"e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff"} Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.386006 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1fc2ed08-0642-4833-ae3b-1f717afa1244","Type":"ContainerDied","Data":"175232f7b661682157fd7201ec4af2059c339bc2a7979490adb48be580f4d7f0"} Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.386026 4802 scope.go:117] "RemoveContainer" containerID="e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.387644 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eedf56f-d2da-4526-94fc-346c826a891d","Type":"ContainerStarted","Data":"011ae81aca95d1be82d74e40f16907c4b9e8c9e64cff7de835b654c4c64af396"} Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.409339 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.409322062 podStartE2EDuration="2.409322062s" podCreationTimestamp="2025-12-01 20:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:20:41.405546955 +0000 UTC m=+1462.968106616" watchObservedRunningTime="2025-12-01 20:20:41.409322062 +0000 UTC m=+1462.971881693" Dec 01 20:20:41 crc kubenswrapper[4802]: 
I1201 20:20:41.444360 4802 scope.go:117] "RemoveContainer" containerID="bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.446907 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.456037 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.476105 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:41 crc kubenswrapper[4802]: E1201 20:20:41.477187 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerName="nova-api-api" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.477230 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerName="nova-api-api" Dec 01 20:20:41 crc kubenswrapper[4802]: E1201 20:20:41.477303 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerName="nova-api-log" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.477314 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerName="nova-api-log" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.484048 4802 scope.go:117] "RemoveContainer" containerID="e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.484653 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerName="nova-api-api" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.484764 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" containerName="nova-api-log" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.487548 
4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.489464 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.491977 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.492842 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 20:20:41 crc kubenswrapper[4802]: E1201 20:20:41.493214 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff\": container with ID starting with e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff not found: ID does not exist" containerID="e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.493374 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff"} err="failed to get container status \"e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff\": rpc error: code = NotFound desc = could not find container \"e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff\": container with ID starting with e89fb39b15934809f3440d94ba26d02548eb90ca9b01d84c9605418f44984bff not found: ID does not exist" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.493421 4802 scope.go:117] "RemoveContainer" containerID="bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99" Dec 01 20:20:41 crc kubenswrapper[4802]: E1201 20:20:41.494261 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99\": container with ID starting with bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99 not found: ID does not exist" containerID="bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.494290 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99"} err="failed to get container status \"bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99\": rpc error: code = NotFound desc = could not find container \"bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99\": container with ID starting with bafece20b2cc9e3e69404cc204063d4e337a91fe9b69aed81c0b355f743d4c99 not found: ID does not exist" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.497613 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.526151 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.526250 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-public-tls-certs\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.526334 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkl2s\" (UniqueName: 
\"kubernetes.io/projected/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-kube-api-access-rkl2s\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.526369 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-config-data\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.526430 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-logs\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.526464 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.627573 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkl2s\" (UniqueName: \"kubernetes.io/projected/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-kube-api-access-rkl2s\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.627954 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-config-data\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 
20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.628181 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-logs\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.628462 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.628660 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-logs\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.628925 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.629242 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-public-tls-certs\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.633498 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-config-data\") pod \"nova-api-0\" (UID: 
\"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.633505 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-public-tls-certs\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.633612 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.642003 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.644721 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkl2s\" (UniqueName: \"kubernetes.io/projected/9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb-kube-api-access-rkl2s\") pod \"nova-api-0\" (UID: \"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb\") " pod="openstack/nova-api-0" Dec 01 20:20:41 crc kubenswrapper[4802]: I1201 20:20:41.825557 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 20:20:42 crc kubenswrapper[4802]: I1201 20:20:42.282530 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 20:20:42 crc kubenswrapper[4802]: I1201 20:20:42.399342 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb","Type":"ContainerStarted","Data":"da12751005e2951de0d154707366d31c95378ec6a142912ea557012460a030f4"} Dec 01 20:20:42 crc kubenswrapper[4802]: I1201 20:20:42.735596 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc2ed08-0642-4833-ae3b-1f717afa1244" path="/var/lib/kubelet/pods/1fc2ed08-0642-4833-ae3b-1f717afa1244/volumes" Dec 01 20:20:43 crc kubenswrapper[4802]: I1201 20:20:43.411305 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb","Type":"ContainerStarted","Data":"e58bb62ec128fb1225b2250a0e02badff1d018c414c2f14acce14c2e5de1c82d"} Dec 01 20:20:43 crc kubenswrapper[4802]: I1201 20:20:43.411346 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb","Type":"ContainerStarted","Data":"83df8ab7d80d7c7a64286a433771b081e2379d125afd252d9d0c655fc753a9b2"} Dec 01 20:20:43 crc kubenswrapper[4802]: I1201 20:20:43.431733 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.431715007 podStartE2EDuration="2.431715007s" podCreationTimestamp="2025-12-01 20:20:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:20:43.429262231 +0000 UTC m=+1464.991821882" watchObservedRunningTime="2025-12-01 20:20:43.431715007 +0000 UTC m=+1464.994274638" Dec 01 20:20:43 crc kubenswrapper[4802]: I1201 20:20:43.965701 4802 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-8f8gt"] Dec 01 20:20:43 crc kubenswrapper[4802]: I1201 20:20:43.968138 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:43 crc kubenswrapper[4802]: I1201 20:20:43.978634 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8f8gt"] Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.073171 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.073254 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.077748 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgndh\" (UniqueName: \"kubernetes.io/projected/7ae25bc9-d72e-42ec-8191-c606ebd53114-kube-api-access-qgndh\") pod \"redhat-operators-8f8gt\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.077797 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-utilities\") pod \"redhat-operators-8f8gt\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.078069 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-catalog-content\") pod \"redhat-operators-8f8gt\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 
20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.179840 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-catalog-content\") pod \"redhat-operators-8f8gt\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.179960 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgndh\" (UniqueName: \"kubernetes.io/projected/7ae25bc9-d72e-42ec-8191-c606ebd53114-kube-api-access-qgndh\") pod \"redhat-operators-8f8gt\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.180021 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-utilities\") pod \"redhat-operators-8f8gt\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.180905 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-catalog-content\") pod \"redhat-operators-8f8gt\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.181013 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-utilities\") pod \"redhat-operators-8f8gt\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.199395 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgndh\" (UniqueName: \"kubernetes.io/projected/7ae25bc9-d72e-42ec-8191-c606ebd53114-kube-api-access-qgndh\") pod \"redhat-operators-8f8gt\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.293748 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.769404 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 20:20:44 crc kubenswrapper[4802]: I1201 20:20:44.786338 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8f8gt"] Dec 01 20:20:44 crc kubenswrapper[4802]: W1201 20:20:44.791683 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ae25bc9_d72e_42ec_8191_c606ebd53114.slice/crio-795e184802371c282392cc33229d1a3b13c9461065045bf78e9c9d51b4c1879b WatchSource:0}: Error finding container 795e184802371c282392cc33229d1a3b13c9461065045bf78e9c9d51b4c1879b: Status 404 returned error can't find the container with id 795e184802371c282392cc33229d1a3b13c9461065045bf78e9c9d51b4c1879b Dec 01 20:20:45 crc kubenswrapper[4802]: I1201 20:20:45.434997 4802 generic.go:334] "Generic (PLEG): container finished" podID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerID="8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d" exitCode=0 Dec 01 20:20:45 crc kubenswrapper[4802]: I1201 20:20:45.435041 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8f8gt" event={"ID":"7ae25bc9-d72e-42ec-8191-c606ebd53114","Type":"ContainerDied","Data":"8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d"} Dec 01 20:20:45 crc 
kubenswrapper[4802]: I1201 20:20:45.435307 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8f8gt" event={"ID":"7ae25bc9-d72e-42ec-8191-c606ebd53114","Type":"ContainerStarted","Data":"795e184802371c282392cc33229d1a3b13c9461065045bf78e9c9d51b4c1879b"} Dec 01 20:20:47 crc kubenswrapper[4802]: I1201 20:20:47.455040 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8f8gt" event={"ID":"7ae25bc9-d72e-42ec-8191-c606ebd53114","Type":"ContainerStarted","Data":"ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b"} Dec 01 20:20:48 crc kubenswrapper[4802]: I1201 20:20:48.467851 4802 generic.go:334] "Generic (PLEG): container finished" podID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerID="ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b" exitCode=0 Dec 01 20:20:48 crc kubenswrapper[4802]: I1201 20:20:48.467898 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8f8gt" event={"ID":"7ae25bc9-d72e-42ec-8191-c606ebd53114","Type":"ContainerDied","Data":"ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b"} Dec 01 20:20:49 crc kubenswrapper[4802]: I1201 20:20:49.074564 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 20:20:49 crc kubenswrapper[4802]: I1201 20:20:49.074949 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 20:20:49 crc kubenswrapper[4802]: I1201 20:20:49.766409 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 20:20:49 crc kubenswrapper[4802]: I1201 20:20:49.794486 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 20:20:50 crc kubenswrapper[4802]: I1201 20:20:50.203649 4802 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="77a605ed-0bb9-4c8d-9a6b-86643ff44518" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 20:20:50 crc kubenswrapper[4802]: I1201 20:20:50.204029 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="77a605ed-0bb9-4c8d-9a6b-86643ff44518" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.185:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 20:20:50 crc kubenswrapper[4802]: I1201 20:20:50.537435 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 20:20:51 crc kubenswrapper[4802]: I1201 20:20:51.513339 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8f8gt" event={"ID":"7ae25bc9-d72e-42ec-8191-c606ebd53114","Type":"ContainerStarted","Data":"96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf"} Dec 01 20:20:51 crc kubenswrapper[4802]: I1201 20:20:51.536260 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8f8gt" podStartSLOduration=3.246986229 podStartE2EDuration="8.536247652s" podCreationTimestamp="2025-12-01 20:20:43 +0000 UTC" firstStartedPulling="2025-12-01 20:20:45.43788104 +0000 UTC m=+1467.000440681" lastFinishedPulling="2025-12-01 20:20:50.727142423 +0000 UTC m=+1472.289702104" observedRunningTime="2025-12-01 20:20:51.529365519 +0000 UTC m=+1473.091925160" watchObservedRunningTime="2025-12-01 20:20:51.536247652 +0000 UTC m=+1473.098807293" Dec 01 20:20:51 crc kubenswrapper[4802]: I1201 20:20:51.826252 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 20:20:51 crc kubenswrapper[4802]: I1201 20:20:51.826300 4802 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 20:20:52 crc kubenswrapper[4802]: I1201 20:20:52.836570 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 20:20:52 crc kubenswrapper[4802]: I1201 20:20:52.836617 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 20:20:54 crc kubenswrapper[4802]: I1201 20:20:54.294231 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:54 crc kubenswrapper[4802]: I1201 20:20:54.295353 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:20:54 crc kubenswrapper[4802]: I1201 20:20:54.861487 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 20:20:55 crc kubenswrapper[4802]: I1201 20:20:55.340377 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8f8gt" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerName="registry-server" probeResult="failure" output=< Dec 01 20:20:55 crc kubenswrapper[4802]: timeout: failed to connect service ":50051" within 1s Dec 01 20:20:55 crc kubenswrapper[4802]: > Dec 01 20:20:58 crc kubenswrapper[4802]: I1201 20:20:58.088881 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:20:58 crc kubenswrapper[4802]: I1201 20:20:58.089389 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:20:59 crc kubenswrapper[4802]: I1201 20:20:59.088575 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 20:20:59 crc kubenswrapper[4802]: I1201 20:20:59.095026 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 20:20:59 crc kubenswrapper[4802]: I1201 20:20:59.097097 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 20:20:59 crc kubenswrapper[4802]: I1201 20:20:59.625929 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 20:21:01 crc kubenswrapper[4802]: I1201 20:21:01.837008 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 20:21:01 crc kubenswrapper[4802]: I1201 20:21:01.837868 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 20:21:01 crc kubenswrapper[4802]: I1201 20:21:01.843738 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 20:21:01 crc kubenswrapper[4802]: I1201 20:21:01.845009 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 20:21:02 crc kubenswrapper[4802]: I1201 20:21:02.634639 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Dec 01 20:21:02 crc kubenswrapper[4802]: I1201 20:21:02.644999 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 20:21:04 crc kubenswrapper[4802]: I1201 20:21:04.341673 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:21:04 crc kubenswrapper[4802]: I1201 20:21:04.389836 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:21:04 crc kubenswrapper[4802]: I1201 20:21:04.579417 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8f8gt"] Dec 01 20:21:05 crc kubenswrapper[4802]: I1201 20:21:05.673910 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8f8gt" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerName="registry-server" containerID="cri-o://96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf" gracePeriod=2 Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.155593 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.511873 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-utilities\") pod \"7ae25bc9-d72e-42ec-8191-c606ebd53114\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.511925 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-catalog-content\") pod \"7ae25bc9-d72e-42ec-8191-c606ebd53114\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.512012 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgndh\" (UniqueName: \"kubernetes.io/projected/7ae25bc9-d72e-42ec-8191-c606ebd53114-kube-api-access-qgndh\") pod \"7ae25bc9-d72e-42ec-8191-c606ebd53114\" (UID: \"7ae25bc9-d72e-42ec-8191-c606ebd53114\") " Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.516318 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-utilities" (OuterVolumeSpecName: "utilities") pod "7ae25bc9-d72e-42ec-8191-c606ebd53114" (UID: "7ae25bc9-d72e-42ec-8191-c606ebd53114"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.525356 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae25bc9-d72e-42ec-8191-c606ebd53114-kube-api-access-qgndh" (OuterVolumeSpecName: "kube-api-access-qgndh") pod "7ae25bc9-d72e-42ec-8191-c606ebd53114" (UID: "7ae25bc9-d72e-42ec-8191-c606ebd53114"). InnerVolumeSpecName "kube-api-access-qgndh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.613021 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.613057 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgndh\" (UniqueName: \"kubernetes.io/projected/7ae25bc9-d72e-42ec-8191-c606ebd53114-kube-api-access-qgndh\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.651369 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ae25bc9-d72e-42ec-8191-c606ebd53114" (UID: "7ae25bc9-d72e-42ec-8191-c606ebd53114"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.684137 4802 generic.go:334] "Generic (PLEG): container finished" podID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerID="96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf" exitCode=0 Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.684184 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8f8gt" event={"ID":"7ae25bc9-d72e-42ec-8191-c606ebd53114","Type":"ContainerDied","Data":"96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf"} Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.684204 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8f8gt" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.684228 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8f8gt" event={"ID":"7ae25bc9-d72e-42ec-8191-c606ebd53114","Type":"ContainerDied","Data":"795e184802371c282392cc33229d1a3b13c9461065045bf78e9c9d51b4c1879b"} Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.684250 4802 scope.go:117] "RemoveContainer" containerID="96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.714575 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ae25bc9-d72e-42ec-8191-c606ebd53114-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.719625 4802 scope.go:117] "RemoveContainer" containerID="ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.755564 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8f8gt"] Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.755620 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8f8gt"] Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.772575 4802 scope.go:117] "RemoveContainer" containerID="8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.807449 4802 scope.go:117] "RemoveContainer" containerID="96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf" Dec 01 20:21:06 crc kubenswrapper[4802]: E1201 20:21:06.808096 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf\": container with ID 
starting with 96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf not found: ID does not exist" containerID="96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.808223 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf"} err="failed to get container status \"96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf\": rpc error: code = NotFound desc = could not find container \"96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf\": container with ID starting with 96b53715efe83c0590f1d659218f4129c0229597643ac1f481672c93ca7195bf not found: ID does not exist" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.808362 4802 scope.go:117] "RemoveContainer" containerID="ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b" Dec 01 20:21:06 crc kubenswrapper[4802]: E1201 20:21:06.810063 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b\": container with ID starting with ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b not found: ID does not exist" containerID="ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.810178 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b"} err="failed to get container status \"ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b\": rpc error: code = NotFound desc = could not find container \"ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b\": container with ID starting with ef23af05e065fa479bcc57249e1de228369ea2728c5605db45721d0949dad39b not found: 
ID does not exist" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.810299 4802 scope.go:117] "RemoveContainer" containerID="8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d" Dec 01 20:21:06 crc kubenswrapper[4802]: E1201 20:21:06.810754 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d\": container with ID starting with 8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d not found: ID does not exist" containerID="8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d" Dec 01 20:21:06 crc kubenswrapper[4802]: I1201 20:21:06.810995 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d"} err="failed to get container status \"8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d\": rpc error: code = NotFound desc = could not find container \"8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d\": container with ID starting with 8eb8d408013d548e2b21a7af118fc528f4a634517c65e32c0854ce463080162d not found: ID does not exist" Dec 01 20:21:08 crc kubenswrapper[4802]: I1201 20:21:08.734128 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" path="/var/lib/kubelet/pods/7ae25bc9-d72e-42ec-8191-c606ebd53114/volumes" Dec 01 20:21:11 crc kubenswrapper[4802]: I1201 20:21:11.077503 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 20:21:12 crc kubenswrapper[4802]: I1201 20:21:12.506490 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 20:21:15 crc kubenswrapper[4802]: I1201 20:21:15.682016 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" 
podUID="563ae8fc-e33c-402e-8901-79434cf68179" containerName="rabbitmq" containerID="cri-o://d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28" gracePeriod=604796 Dec 01 20:21:16 crc kubenswrapper[4802]: I1201 20:21:16.819479 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" containerName="rabbitmq" containerID="cri-o://9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9" gracePeriod=604796 Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.137067 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.237812 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304337 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-erlang-cookie\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304426 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-config-data\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304516 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-server-conf\") 
pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304654 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tc9c\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-kube-api-access-9tc9c\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304693 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/563ae8fc-e33c-402e-8901-79434cf68179-erlang-cookie-secret\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304744 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-confd\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304782 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-tls\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304829 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/563ae8fc-e33c-402e-8901-79434cf68179-pod-info\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304884 4802 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-plugins-conf\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304906 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.304990 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-plugins\") pod \"563ae8fc-e33c-402e-8901-79434cf68179\" (UID: \"563ae8fc-e33c-402e-8901-79434cf68179\") " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.305179 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.305423 4802 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.305951 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.306883 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.314210 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/563ae8fc-e33c-402e-8901-79434cf68179-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.321571 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/563ae8fc-e33c-402e-8901-79434cf68179-pod-info" (OuterVolumeSpecName: "pod-info") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.323598 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.329444 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.331496 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-kube-api-access-9tc9c" (OuterVolumeSpecName: "kube-api-access-9tc9c") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "kube-api-access-9tc9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.379985 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-config-data" (OuterVolumeSpecName: "config-data") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.409177 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tc9c\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-kube-api-access-9tc9c\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.409727 4802 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/563ae8fc-e33c-402e-8901-79434cf68179-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.409825 4802 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.409902 4802 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/563ae8fc-e33c-402e-8901-79434cf68179-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.409958 4802 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.410040 4802 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.410104 4802 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.410171 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.421362 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-server-conf" (OuterVolumeSpecName: "server-conf") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.455409 4802 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.499035 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "563ae8fc-e33c-402e-8901-79434cf68179" (UID: "563ae8fc-e33c-402e-8901-79434cf68179"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.512018 4802 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.512299 4802 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/563ae8fc-e33c-402e-8901-79434cf68179-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.512367 4802 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/563ae8fc-e33c-402e-8901-79434cf68179-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.855212 4802 generic.go:334] "Generic (PLEG): container finished" podID="563ae8fc-e33c-402e-8901-79434cf68179" containerID="d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28" exitCode=0 Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.855297 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.855326 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"563ae8fc-e33c-402e-8901-79434cf68179","Type":"ContainerDied","Data":"d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28"} Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.855825 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"563ae8fc-e33c-402e-8901-79434cf68179","Type":"ContainerDied","Data":"85ef24648f2d17c68856a4994116a5efe34c0eb167586eb0766c35be5d059735"} Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.855846 4802 scope.go:117] "RemoveContainer" containerID="d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.878837 4802 scope.go:117] "RemoveContainer" containerID="cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.880530 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.889587 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.899251 4802 scope.go:117] "RemoveContainer" containerID="d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28" Dec 01 20:21:22 crc kubenswrapper[4802]: E1201 20:21:22.899772 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28\": container with ID starting with d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28 not found: ID does not exist" containerID="d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28" Dec 01 20:21:22 crc 
kubenswrapper[4802]: I1201 20:21:22.899804 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28"} err="failed to get container status \"d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28\": rpc error: code = NotFound desc = could not find container \"d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28\": container with ID starting with d549f413aff5e0e61b86a2f0b602c98653afe32757995bc57f3ef7072fec4a28 not found: ID does not exist" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.899827 4802 scope.go:117] "RemoveContainer" containerID="cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a" Dec 01 20:21:22 crc kubenswrapper[4802]: E1201 20:21:22.900255 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a\": container with ID starting with cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a not found: ID does not exist" containerID="cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.900290 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a"} err="failed to get container status \"cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a\": rpc error: code = NotFound desc = could not find container \"cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a\": container with ID starting with cfe5292ebff6919447e6cb75a9e6f7f38ca2cf0cfc55c891c7899f5e93c7c88a not found: ID does not exist" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.922211 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 20:21:22 crc 
kubenswrapper[4802]: E1201 20:21:22.922645 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563ae8fc-e33c-402e-8901-79434cf68179" containerName="setup-container" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.922669 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="563ae8fc-e33c-402e-8901-79434cf68179" containerName="setup-container" Dec 01 20:21:22 crc kubenswrapper[4802]: E1201 20:21:22.922688 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerName="extract-content" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.922697 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerName="extract-content" Dec 01 20:21:22 crc kubenswrapper[4802]: E1201 20:21:22.922722 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="563ae8fc-e33c-402e-8901-79434cf68179" containerName="rabbitmq" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.922728 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="563ae8fc-e33c-402e-8901-79434cf68179" containerName="rabbitmq" Dec 01 20:21:22 crc kubenswrapper[4802]: E1201 20:21:22.922735 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerName="registry-server" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.922741 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerName="registry-server" Dec 01 20:21:22 crc kubenswrapper[4802]: E1201 20:21:22.922758 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerName="extract-utilities" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.922764 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerName="extract-utilities" Dec 01 20:21:22 crc kubenswrapper[4802]: 
I1201 20:21:22.922941 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="563ae8fc-e33c-402e-8901-79434cf68179" containerName="rabbitmq" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.922951 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae25bc9-d72e-42ec-8191-c606ebd53114" containerName="registry-server" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.926434 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.928959 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.929378 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.929539 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.929686 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.930016 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8cclk" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.930149 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.931938 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 20:21:22 crc kubenswrapper[4802]: I1201 20:21:22.944713 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.120971 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121016 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121047 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121074 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121122 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121164 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121188 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121231 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121263 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psb42\" (UniqueName: \"kubernetes.io/projected/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-kube-api-access-psb42\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121292 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.121313 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-config-data\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.222699 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223067 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223120 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223170 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223223 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223265 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223288 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psb42\" (UniqueName: \"kubernetes.io/projected/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-kube-api-access-psb42\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223304 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223318 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-config-data\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223347 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223366 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.223558 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.224970 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.225896 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.226833 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.226982 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.233744 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-config-data\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.236221 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.245115 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.251257 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.252011 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.254932 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-psb42\" (UniqueName: \"kubernetes.io/projected/1fe488ab-29e8-4ed4-8663-be8e88c1a7ef-kube-api-access-psb42\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.287676 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef\") " pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.374642 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.386911 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528415 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-server-conf\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528481 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-erlang-cookie-secret\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528506 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-confd\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: 
\"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528532 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-plugins\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528554 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-tls\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528599 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-erlang-cookie\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528615 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528640 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-pod-info\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528657 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-plugins-conf\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528711 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptsz2\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-kube-api-access-ptsz2\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.528764 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-config-data\") pod \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\" (UID: \"97e35ed2-d0e5-4f29-8869-3740e22f5cd9\") " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.532898 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.533344 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.536819 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.537530 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.539629 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-pod-info" (OuterVolumeSpecName: "pod-info") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.541442 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-kube-api-access-ptsz2" (OuterVolumeSpecName: "kube-api-access-ptsz2") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "kube-api-access-ptsz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.541703 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.541746 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.565023 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-config-data" (OuterVolumeSpecName: "config-data") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.603124 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-server-conf" (OuterVolumeSpecName: "server-conf") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.630483 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptsz2\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-kube-api-access-ptsz2\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.630516 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.630527 4802 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.630538 4802 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.630547 4802 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.630555 4802 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.630564 4802 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: 
I1201 20:21:23.630590 4802 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.630600 4802 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.630608 4802 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.647446 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "97e35ed2-d0e5-4f29-8869-3740e22f5cd9" (UID: "97e35ed2-d0e5-4f29-8869-3740e22f5cd9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.650335 4802 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.731627 4802 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.731661 4802 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97e35ed2-d0e5-4f29-8869-3740e22f5cd9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.882012 4802 generic.go:334] "Generic (PLEG): container finished" podID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" containerID="9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9" exitCode=0 Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.882104 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"97e35ed2-d0e5-4f29-8869-3740e22f5cd9","Type":"ContainerDied","Data":"9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9"} Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.882138 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"97e35ed2-d0e5-4f29-8869-3740e22f5cd9","Type":"ContainerDied","Data":"a3e44ee4e6433e4d54b3f5b625cf43c653949c17727eafe548e6cc7dae4cfa9c"} Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.882170 4802 scope.go:117] "RemoveContainer" containerID="9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.882412 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.901156 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.911579 4802 scope.go:117] "RemoveContainer" containerID="957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.928703 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 20:21:23 crc kubenswrapper[4802]: E1201 20:21:23.940930 4802 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e35ed2_d0e5_4f29_8869_3740e22f5cd9.slice/crio-a3e44ee4e6433e4d54b3f5b625cf43c653949c17727eafe548e6cc7dae4cfa9c\": RecentStats: unable to find data in memory cache]" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.961686 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.976405 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 20:21:23 crc kubenswrapper[4802]: E1201 20:21:23.976771 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" containerName="setup-container" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.976787 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" containerName="setup-container" Dec 01 20:21:23 crc kubenswrapper[4802]: E1201 20:21:23.976798 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" containerName="rabbitmq" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.976803 4802 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" containerName="rabbitmq" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.976980 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" containerName="rabbitmq" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.977876 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.988662 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.988928 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.989028 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.989126 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.989214 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.989520 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dfrbx" Dec 01 20:21:23 crc kubenswrapper[4802]: I1201 20:21:23.989732 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.012721 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.042055 4802 scope.go:117] "RemoveContainer" 
containerID="9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9" Dec 01 20:21:24 crc kubenswrapper[4802]: E1201 20:21:24.042682 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9\": container with ID starting with 9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9 not found: ID does not exist" containerID="9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.042722 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9"} err="failed to get container status \"9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9\": rpc error: code = NotFound desc = could not find container \"9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9\": container with ID starting with 9a2ffad65cd4005ed7b8222a53e48dff7feff01c3dc4d96dfe89844ff33e7fa9 not found: ID does not exist" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.042747 4802 scope.go:117] "RemoveContainer" containerID="957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f" Dec 01 20:21:24 crc kubenswrapper[4802]: E1201 20:21:24.043001 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f\": container with ID starting with 957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f not found: ID does not exist" containerID="957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.043021 4802 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f"} err="failed to get container status \"957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f\": rpc error: code = NotFound desc = could not find container \"957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f\": container with ID starting with 957695bc3f9738b3ae79ad36eac3d42d9c4d14740c1e9708f1951ad24b1d393f not found: ID does not exist" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138127 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138188 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138261 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138280 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138298 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138331 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138360 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138375 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138694 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138761 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vqrw\" (UniqueName: \"kubernetes.io/projected/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-kube-api-access-2vqrw\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.138957 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240565 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240620 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vqrw\" (UniqueName: \"kubernetes.io/projected/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-kube-api-access-2vqrw\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240653 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 
20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240693 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240730 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240770 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240792 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240815 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240852 4802 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240887 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.240908 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.241126 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.242076 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.242650 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.242732 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.242931 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.243639 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.246530 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.247028 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.247554 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.253688 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.263067 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vqrw\" (UniqueName: \"kubernetes.io/projected/d078b34a-6a2a-4ea0-b7c8-c99ff6942170-kube-api-access-2vqrw\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.282082 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"d078b34a-6a2a-4ea0-b7c8-c99ff6942170\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.446092 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.730076 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="563ae8fc-e33c-402e-8901-79434cf68179" path="/var/lib/kubelet/pods/563ae8fc-e33c-402e-8901-79434cf68179/volumes" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.731147 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e35ed2-d0e5-4f29-8869-3740e22f5cd9" path="/var/lib/kubelet/pods/97e35ed2-d0e5-4f29-8869-3740e22f5cd9/volumes" Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.875600 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 20:21:24 crc kubenswrapper[4802]: W1201 20:21:24.876260 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd078b34a_6a2a_4ea0_b7c8_c99ff6942170.slice/crio-bbf9fddbb7b19095854bb30e7bc8655978a7a9a5560e26bf53c0aeba57963bbc WatchSource:0}: Error finding container bbf9fddbb7b19095854bb30e7bc8655978a7a9a5560e26bf53c0aeba57963bbc: Status 404 returned error can't find the container with id bbf9fddbb7b19095854bb30e7bc8655978a7a9a5560e26bf53c0aeba57963bbc Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.892667 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d078b34a-6a2a-4ea0-b7c8-c99ff6942170","Type":"ContainerStarted","Data":"bbf9fddbb7b19095854bb30e7bc8655978a7a9a5560e26bf53c0aeba57963bbc"} Dec 01 20:21:24 crc kubenswrapper[4802]: I1201 20:21:24.894690 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef","Type":"ContainerStarted","Data":"eb2753e5c7dc8d9203c72342614d3a50e4bee9164676ad17335d58df87965d23"} Dec 01 20:21:25 crc kubenswrapper[4802]: I1201 20:21:25.909149 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef","Type":"ContainerStarted","Data":"795279a259d3bedf874fee05058768d8132ceba2b8c8cf26f314ec40b612a47f"} Dec 01 20:21:26 crc kubenswrapper[4802]: I1201 20:21:26.921294 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d078b34a-6a2a-4ea0-b7c8-c99ff6942170","Type":"ContainerStarted","Data":"588b2af187c83a6380d2ac02f322e34bca324f94d6f9843688007cce72cac347"} Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.776366 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-jnkxh"] Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.778839 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.782171 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.802852 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-jnkxh"] Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.809562 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.809740 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" 
Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.809806 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.809913 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-config\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.809931 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxq8h\" (UniqueName: \"kubernetes.io/projected/2bbdf4f0-0691-44df-80f4-14daca08ee9f-kube-api-access-mxq8h\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.809965 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.911754 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.911821 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.911850 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.911898 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-config\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.911917 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxq8h\" (UniqueName: \"kubernetes.io/projected/2bbdf4f0-0691-44df-80f4-14daca08ee9f-kube-api-access-mxq8h\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.911941 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc 
kubenswrapper[4802]: I1201 20:21:27.912963 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.913001 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.913020 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-config\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.913125 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.913451 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:27 crc kubenswrapper[4802]: I1201 20:21:27.944372 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-mxq8h\" (UniqueName: \"kubernetes.io/projected/2bbdf4f0-0691-44df-80f4-14daca08ee9f-kube-api-access-mxq8h\") pod \"dnsmasq-dns-6447ccbd8f-jnkxh\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.088871 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.089210 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.103576 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.582354 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-jnkxh"] Dec 01 20:21:28 crc kubenswrapper[4802]: W1201 20:21:28.584691 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bbdf4f0_0691_44df_80f4_14daca08ee9f.slice/crio-f5a97b08f87527f7d0aeca6d6b2fac73555304be99bad778767deee033244430 WatchSource:0}: Error finding container f5a97b08f87527f7d0aeca6d6b2fac73555304be99bad778767deee033244430: Status 404 returned error can't find the container with id f5a97b08f87527f7d0aeca6d6b2fac73555304be99bad778767deee033244430 Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.924984 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx"] Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.926885 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.930556 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.933733 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.935101 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.935467 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.950904 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx"] Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.963617 4802 generic.go:334] "Generic (PLEG): container finished" podID="2bbdf4f0-0691-44df-80f4-14daca08ee9f" containerID="c38b6d4c5b810d41a5b58aa0bbb34e2c7cb04c58073e46a32b8328e270092bd6" exitCode=0 Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.963662 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" event={"ID":"2bbdf4f0-0691-44df-80f4-14daca08ee9f","Type":"ContainerDied","Data":"c38b6d4c5b810d41a5b58aa0bbb34e2c7cb04c58073e46a32b8328e270092bd6"} Dec 01 20:21:28 crc kubenswrapper[4802]: I1201 20:21:28.963691 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" event={"ID":"2bbdf4f0-0691-44df-80f4-14daca08ee9f","Type":"ContainerStarted","Data":"f5a97b08f87527f7d0aeca6d6b2fac73555304be99bad778767deee033244430"} Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.041719 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.041774 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2dkh\" (UniqueName: \"kubernetes.io/projected/fd889212-5f48-48ab-9f9a-5b028760aa6d-kube-api-access-k2dkh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.041862 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.041977 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.143176 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.143486 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2dkh\" (UniqueName: \"kubernetes.io/projected/fd889212-5f48-48ab-9f9a-5b028760aa6d-kube-api-access-k2dkh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.143582 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.143663 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.148061 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.148244 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.148974 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.166160 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2dkh\" (UniqueName: \"kubernetes.io/projected/fd889212-5f48-48ab-9f9a-5b028760aa6d-kube-api-access-k2dkh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.409285 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.982943 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" event={"ID":"2bbdf4f0-0691-44df-80f4-14daca08ee9f","Type":"ContainerStarted","Data":"e96c6a0abb2e46394691b4dddbe5bf7e7d26106c1a695582de0e7c39ea9c73f2"} Dec 01 20:21:29 crc kubenswrapper[4802]: I1201 20:21:29.984235 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:30 crc kubenswrapper[4802]: I1201 20:21:30.004501 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx"] Dec 01 20:21:30 crc kubenswrapper[4802]: I1201 20:21:30.005604 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" podStartSLOduration=3.005593614 podStartE2EDuration="3.005593614s" podCreationTimestamp="2025-12-01 20:21:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:21:30.004724617 +0000 UTC m=+1511.567284258" watchObservedRunningTime="2025-12-01 20:21:30.005593614 +0000 UTC m=+1511.568153255" Dec 01 20:21:30 crc kubenswrapper[4802]: W1201 20:21:30.012341 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd889212_5f48_48ab_9f9a_5b028760aa6d.slice/crio-16b4c91e17bbcb2b8c7916241cb6740ea3b1d8c0afe991fda3d72f5b051f0e1d WatchSource:0}: Error finding container 16b4c91e17bbcb2b8c7916241cb6740ea3b1d8c0afe991fda3d72f5b051f0e1d: Status 404 returned error can't find the container with id 16b4c91e17bbcb2b8c7916241cb6740ea3b1d8c0afe991fda3d72f5b051f0e1d Dec 01 20:21:31 crc kubenswrapper[4802]: I1201 20:21:31.005084 4802 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" event={"ID":"fd889212-5f48-48ab-9f9a-5b028760aa6d","Type":"ContainerStarted","Data":"16b4c91e17bbcb2b8c7916241cb6740ea3b1d8c0afe991fda3d72f5b051f0e1d"} Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.105421 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.172599 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-frqg6"] Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.173108 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" podUID="ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" containerName="dnsmasq-dns" containerID="cri-o://f353c04e5125e27586c04b29c7eb4c1d8d485e51ad407dedf67d73eb7d9d3b45" gracePeriod=10 Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.304388 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-4klz2"] Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.306310 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.325805 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-4klz2"] Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.438125 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.438276 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.438337 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-494kb\" (UniqueName: \"kubernetes.io/projected/8dfcb60c-7f93-47a3-8343-2d99177de7f2-kube-api-access-494kb\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.438377 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.438407 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-config\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.438438 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.540235 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.540337 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-494kb\" (UniqueName: \"kubernetes.io/projected/8dfcb60c-7f93-47a3-8343-2d99177de7f2-kube-api-access-494kb\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.540383 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.540415 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-config\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.540440 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.540486 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.541367 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.541397 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.541696 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.541791 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.542567 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-config\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.561538 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-494kb\" (UniqueName: \"kubernetes.io/projected/8dfcb60c-7f93-47a3-8343-2d99177de7f2-kube-api-access-494kb\") pod \"dnsmasq-dns-864d5fc68c-4klz2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") " pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:38 crc kubenswrapper[4802]: I1201 20:21:38.645725 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:39 crc kubenswrapper[4802]: I1201 20:21:39.117590 4802 generic.go:334] "Generic (PLEG): container finished" podID="ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" containerID="f353c04e5125e27586c04b29c7eb4c1d8d485e51ad407dedf67d73eb7d9d3b45" exitCode=0 Dec 01 20:21:39 crc kubenswrapper[4802]: I1201 20:21:39.117642 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" event={"ID":"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6","Type":"ContainerDied","Data":"f353c04e5125e27586c04b29c7eb4c1d8d485e51ad407dedf67d73eb7d9d3b45"} Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.149980 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.274809 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-sb\") pod \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.274877 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-config\") pod \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.274952 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-dns-svc\") pod \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.274999 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-9q4s9\" (UniqueName: \"kubernetes.io/projected/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-kube-api-access-9q4s9\") pod \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.275228 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-nb\") pod \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\" (UID: \"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6\") " Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.282419 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-kube-api-access-9q4s9" (OuterVolumeSpecName: "kube-api-access-9q4s9") pod "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" (UID: "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6"). InnerVolumeSpecName "kube-api-access-9q4s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.324637 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" (UID: "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.329022 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" (UID: "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.331166 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" (UID: "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.335042 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-config" (OuterVolumeSpecName: "config") pod "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" (UID: "ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.379278 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.379309 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.379320 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.379329 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:40 crc kubenswrapper[4802]: 
I1201 20:21:40.379916 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q4s9\" (UniqueName: \"kubernetes.io/projected/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6-kube-api-access-9q4s9\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:40 crc kubenswrapper[4802]: I1201 20:21:40.408468 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-4klz2"] Dec 01 20:21:40 crc kubenswrapper[4802]: W1201 20:21:40.424282 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dfcb60c_7f93_47a3_8343_2d99177de7f2.slice/crio-5f8d9a63eb925a6bf2579a16a99ddfaa527411c9a620cbd843574c54b2a486ba WatchSource:0}: Error finding container 5f8d9a63eb925a6bf2579a16a99ddfaa527411c9a620cbd843574c54b2a486ba: Status 404 returned error can't find the container with id 5f8d9a63eb925a6bf2579a16a99ddfaa527411c9a620cbd843574c54b2a486ba Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.139978 4802 generic.go:334] "Generic (PLEG): container finished" podID="8dfcb60c-7f93-47a3-8343-2d99177de7f2" containerID="81f3bc7904b921b1be4b2cb626fe9c00b75fda076bd77760295e66a8837e56d5" exitCode=0 Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.140077 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" event={"ID":"8dfcb60c-7f93-47a3-8343-2d99177de7f2","Type":"ContainerDied","Data":"81f3bc7904b921b1be4b2cb626fe9c00b75fda076bd77760295e66a8837e56d5"} Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.140360 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" event={"ID":"8dfcb60c-7f93-47a3-8343-2d99177de7f2","Type":"ContainerStarted","Data":"5f8d9a63eb925a6bf2579a16a99ddfaa527411c9a620cbd843574c54b2a486ba"} Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.142722 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" event={"ID":"fd889212-5f48-48ab-9f9a-5b028760aa6d","Type":"ContainerStarted","Data":"a9e3908a710fd4d9ddd3ffa20e0dcf6be99dd2951b56d29cc36f0da2256a2480"} Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.145472 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" event={"ID":"ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6","Type":"ContainerDied","Data":"944e2985a3c1769855993a31a4bbfced240775d457af31a07bd21bac414dee57"} Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.145521 4802 scope.go:117] "RemoveContainer" containerID="f353c04e5125e27586c04b29c7eb4c1d8d485e51ad407dedf67d73eb7d9d3b45" Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.145533 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-frqg6" Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.191867 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" podStartSLOduration=3.218803488 podStartE2EDuration="13.191849079s" podCreationTimestamp="2025-12-01 20:21:28 +0000 UTC" firstStartedPulling="2025-12-01 20:21:30.016581974 +0000 UTC m=+1511.579141615" lastFinishedPulling="2025-12-01 20:21:39.989627565 +0000 UTC m=+1521.552187206" observedRunningTime="2025-12-01 20:21:41.179337131 +0000 UTC m=+1522.741896782" watchObservedRunningTime="2025-12-01 20:21:41.191849079 +0000 UTC m=+1522.754408730" Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.291772 4802 scope.go:117] "RemoveContainer" containerID="791ffee4271476684fca7f33a2ce01946474fbd2f9c2d784fe9cacab5f50044c" Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.320692 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-frqg6"] Dec 01 20:21:41 crc kubenswrapper[4802]: I1201 20:21:41.332104 4802 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-5b856c5697-frqg6"] Dec 01 20:21:42 crc kubenswrapper[4802]: I1201 20:21:42.164482 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" event={"ID":"8dfcb60c-7f93-47a3-8343-2d99177de7f2","Type":"ContainerStarted","Data":"3aadd1e1bb764aef360aa9ef7b8db58ba64fd44f8183f4f390db9c748c4948cd"} Dec 01 20:21:42 crc kubenswrapper[4802]: I1201 20:21:42.165008 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:42 crc kubenswrapper[4802]: I1201 20:21:42.207225 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" podStartSLOduration=4.20717404 podStartE2EDuration="4.20717404s" podCreationTimestamp="2025-12-01 20:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:21:42.204030252 +0000 UTC m=+1523.766589943" watchObservedRunningTime="2025-12-01 20:21:42.20717404 +0000 UTC m=+1523.769733711" Dec 01 20:21:42 crc kubenswrapper[4802]: I1201 20:21:42.731751 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" path="/var/lib/kubelet/pods/ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6/volumes" Dec 01 20:21:48 crc kubenswrapper[4802]: I1201 20:21:48.647445 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:21:48 crc kubenswrapper[4802]: I1201 20:21:48.746748 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-jnkxh"] Dec 01 20:21:48 crc kubenswrapper[4802]: I1201 20:21:48.747015 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" podUID="2bbdf4f0-0691-44df-80f4-14daca08ee9f" containerName="dnsmasq-dns" 
containerID="cri-o://e96c6a0abb2e46394691b4dddbe5bf7e7d26106c1a695582de0e7c39ea9c73f2" gracePeriod=10 Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.347240 4802 generic.go:334] "Generic (PLEG): container finished" podID="2bbdf4f0-0691-44df-80f4-14daca08ee9f" containerID="e96c6a0abb2e46394691b4dddbe5bf7e7d26106c1a695582de0e7c39ea9c73f2" exitCode=0 Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.347324 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" event={"ID":"2bbdf4f0-0691-44df-80f4-14daca08ee9f","Type":"ContainerDied","Data":"e96c6a0abb2e46394691b4dddbe5bf7e7d26106c1a695582de0e7c39ea9c73f2"} Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.347599 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" event={"ID":"2bbdf4f0-0691-44df-80f4-14daca08ee9f","Type":"ContainerDied","Data":"f5a97b08f87527f7d0aeca6d6b2fac73555304be99bad778767deee033244430"} Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.347619 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5a97b08f87527f7d0aeca6d6b2fac73555304be99bad778767deee033244430" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.353047 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.470387 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-dns-svc\") pod \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.470427 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-sb\") pod \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.470510 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-openstack-edpm-ipam\") pod \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.470689 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-nb\") pod \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.470755 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxq8h\" (UniqueName: \"kubernetes.io/projected/2bbdf4f0-0691-44df-80f4-14daca08ee9f-kube-api-access-mxq8h\") pod \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.470780 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-config\") pod \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\" (UID: \"2bbdf4f0-0691-44df-80f4-14daca08ee9f\") " Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.480859 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbdf4f0-0691-44df-80f4-14daca08ee9f-kube-api-access-mxq8h" (OuterVolumeSpecName: "kube-api-access-mxq8h") pod "2bbdf4f0-0691-44df-80f4-14daca08ee9f" (UID: "2bbdf4f0-0691-44df-80f4-14daca08ee9f"). InnerVolumeSpecName "kube-api-access-mxq8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.524659 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-config" (OuterVolumeSpecName: "config") pod "2bbdf4f0-0691-44df-80f4-14daca08ee9f" (UID: "2bbdf4f0-0691-44df-80f4-14daca08ee9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.525400 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2bbdf4f0-0691-44df-80f4-14daca08ee9f" (UID: "2bbdf4f0-0691-44df-80f4-14daca08ee9f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.525666 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2bbdf4f0-0691-44df-80f4-14daca08ee9f" (UID: "2bbdf4f0-0691-44df-80f4-14daca08ee9f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.526722 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2bbdf4f0-0691-44df-80f4-14daca08ee9f" (UID: "2bbdf4f0-0691-44df-80f4-14daca08ee9f"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.549341 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2bbdf4f0-0691-44df-80f4-14daca08ee9f" (UID: "2bbdf4f0-0691-44df-80f4-14daca08ee9f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.572291 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxq8h\" (UniqueName: \"kubernetes.io/projected/2bbdf4f0-0691-44df-80f4-14daca08ee9f-kube-api-access-mxq8h\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.572336 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.572347 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.572357 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-dns-svc\") on node \"crc\" 
DevicePath \"\"" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.572365 4802 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:49 crc kubenswrapper[4802]: I1201 20:21:49.572373 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2bbdf4f0-0691-44df-80f4-14daca08ee9f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:50 crc kubenswrapper[4802]: I1201 20:21:50.354925 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-jnkxh" Dec 01 20:21:50 crc kubenswrapper[4802]: I1201 20:21:50.395259 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-jnkxh"] Dec 01 20:21:50 crc kubenswrapper[4802]: I1201 20:21:50.402462 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-jnkxh"] Dec 01 20:21:50 crc kubenswrapper[4802]: I1201 20:21:50.730130 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bbdf4f0-0691-44df-80f4-14daca08ee9f" path="/var/lib/kubelet/pods/2bbdf4f0-0691-44df-80f4-14daca08ee9f/volumes" Dec 01 20:21:56 crc kubenswrapper[4802]: I1201 20:21:56.413721 4802 generic.go:334] "Generic (PLEG): container finished" podID="fd889212-5f48-48ab-9f9a-5b028760aa6d" containerID="a9e3908a710fd4d9ddd3ffa20e0dcf6be99dd2951b56d29cc36f0da2256a2480" exitCode=0 Dec 01 20:21:56 crc kubenswrapper[4802]: I1201 20:21:56.413782 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" event={"ID":"fd889212-5f48-48ab-9f9a-5b028760aa6d","Type":"ContainerDied","Data":"a9e3908a710fd4d9ddd3ffa20e0dcf6be99dd2951b56d29cc36f0da2256a2480"} Dec 01 20:21:57 crc kubenswrapper[4802]: I1201 20:21:57.892120 4802 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.028507 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2dkh\" (UniqueName: \"kubernetes.io/projected/fd889212-5f48-48ab-9f9a-5b028760aa6d-kube-api-access-k2dkh\") pod \"fd889212-5f48-48ab-9f9a-5b028760aa6d\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.028578 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-inventory\") pod \"fd889212-5f48-48ab-9f9a-5b028760aa6d\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.028890 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-ssh-key\") pod \"fd889212-5f48-48ab-9f9a-5b028760aa6d\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.029108 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-repo-setup-combined-ca-bundle\") pod \"fd889212-5f48-48ab-9f9a-5b028760aa6d\" (UID: \"fd889212-5f48-48ab-9f9a-5b028760aa6d\") " Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.034501 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd889212-5f48-48ab-9f9a-5b028760aa6d-kube-api-access-k2dkh" (OuterVolumeSpecName: "kube-api-access-k2dkh") pod "fd889212-5f48-48ab-9f9a-5b028760aa6d" (UID: "fd889212-5f48-48ab-9f9a-5b028760aa6d"). InnerVolumeSpecName "kube-api-access-k2dkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.035078 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fd889212-5f48-48ab-9f9a-5b028760aa6d" (UID: "fd889212-5f48-48ab-9f9a-5b028760aa6d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.060670 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fd889212-5f48-48ab-9f9a-5b028760aa6d" (UID: "fd889212-5f48-48ab-9f9a-5b028760aa6d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.062245 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-inventory" (OuterVolumeSpecName: "inventory") pod "fd889212-5f48-48ab-9f9a-5b028760aa6d" (UID: "fd889212-5f48-48ab-9f9a-5b028760aa6d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.088871 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.088939 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.089005 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.090033 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.090112 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" gracePeriod=600 Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.131647 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.131859 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.131954 4802 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd889212-5f48-48ab-9f9a-5b028760aa6d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.132025 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2dkh\" (UniqueName: \"kubernetes.io/projected/fd889212-5f48-48ab-9f9a-5b028760aa6d-kube-api-access-k2dkh\") on node \"crc\" DevicePath \"\"" Dec 01 20:21:58 crc kubenswrapper[4802]: E1201 20:21:58.217314 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.432779 4802 generic.go:334] "Generic (PLEG): container finished" podID="1fe488ab-29e8-4ed4-8663-be8e88c1a7ef" containerID="795279a259d3bedf874fee05058768d8132ceba2b8c8cf26f314ec40b612a47f" exitCode=0 Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.432893 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef","Type":"ContainerDied","Data":"795279a259d3bedf874fee05058768d8132ceba2b8c8cf26f314ec40b612a47f"} Dec 01 20:21:58 
crc kubenswrapper[4802]: I1201 20:21:58.436032 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" event={"ID":"fd889212-5f48-48ab-9f9a-5b028760aa6d","Type":"ContainerDied","Data":"16b4c91e17bbcb2b8c7916241cb6740ea3b1d8c0afe991fda3d72f5b051f0e1d"} Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.436042 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.436158 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16b4c91e17bbcb2b8c7916241cb6740ea3b1d8c0afe991fda3d72f5b051f0e1d" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.438631 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" exitCode=0 Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.438662 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619"} Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.438688 4802 scope.go:117] "RemoveContainer" containerID="64791094ee9b4b30a04bff1aa7e941faf215ab40eb68a1d66d92dede04b50331" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.439079 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:21:58 crc kubenswrapper[4802]: E1201 20:21:58.439362 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.558499 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs"] Dec 01 20:21:58 crc kubenswrapper[4802]: E1201 20:21:58.558981 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbdf4f0-0691-44df-80f4-14daca08ee9f" containerName="init" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.559008 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbdf4f0-0691-44df-80f4-14daca08ee9f" containerName="init" Dec 01 20:21:58 crc kubenswrapper[4802]: E1201 20:21:58.559022 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd889212-5f48-48ab-9f9a-5b028760aa6d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.559036 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd889212-5f48-48ab-9f9a-5b028760aa6d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 20:21:58 crc kubenswrapper[4802]: E1201 20:21:58.559052 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" containerName="dnsmasq-dns" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.559059 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" containerName="dnsmasq-dns" Dec 01 20:21:58 crc kubenswrapper[4802]: E1201 20:21:58.559076 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbdf4f0-0691-44df-80f4-14daca08ee9f" containerName="dnsmasq-dns" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.559083 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbdf4f0-0691-44df-80f4-14daca08ee9f" 
containerName="dnsmasq-dns" Dec 01 20:21:58 crc kubenswrapper[4802]: E1201 20:21:58.559119 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" containerName="init" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.559128 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" containerName="init" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.559351 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bbdf4f0-0691-44df-80f4-14daca08ee9f" containerName="dnsmasq-dns" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.559367 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd2de8c-0e21-4d6c-9d1d-d74600eb8ff6" containerName="dnsmasq-dns" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.559380 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd889212-5f48-48ab-9f9a-5b028760aa6d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.560118 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.565122 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.565342 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.565463 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.565552 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.569969 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs"] Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.642581 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.642993 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.643251 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.643302 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsjvb\" (UniqueName: \"kubernetes.io/projected/3d19da6a-24ea-49b2-9415-100c4db7bccd-kube-api-access-rsjvb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.744817 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.745283 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsjvb\" (UniqueName: \"kubernetes.io/projected/3d19da6a-24ea-49b2-9415-100c4db7bccd-kube-api-access-rsjvb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.745330 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.745428 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.749138 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.749493 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.749874 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 
20:21:58.759911 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsjvb\" (UniqueName: \"kubernetes.io/projected/3d19da6a-24ea-49b2-9415-100c4db7bccd-kube-api-access-rsjvb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:58 crc kubenswrapper[4802]: I1201 20:21:58.895031 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:21:59 crc kubenswrapper[4802]: I1201 20:21:59.451408 4802 generic.go:334] "Generic (PLEG): container finished" podID="d078b34a-6a2a-4ea0-b7c8-c99ff6942170" containerID="588b2af187c83a6380d2ac02f322e34bca324f94d6f9843688007cce72cac347" exitCode=0 Dec 01 20:21:59 crc kubenswrapper[4802]: I1201 20:21:59.451743 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d078b34a-6a2a-4ea0-b7c8-c99ff6942170","Type":"ContainerDied","Data":"588b2af187c83a6380d2ac02f322e34bca324f94d6f9843688007cce72cac347"} Dec 01 20:21:59 crc kubenswrapper[4802]: I1201 20:21:59.453802 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs"] Dec 01 20:21:59 crc kubenswrapper[4802]: I1201 20:21:59.458104 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1fe488ab-29e8-4ed4-8663-be8e88c1a7ef","Type":"ContainerStarted","Data":"065b850cb3227efb3a7c2802d631fe139af5a39c4b526b24ea3d997501010b0c"} Dec 01 20:21:59 crc kubenswrapper[4802]: I1201 20:21:59.458637 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 20:21:59 crc kubenswrapper[4802]: W1201 20:21:59.464334 4802 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d19da6a_24ea_49b2_9415_100c4db7bccd.slice/crio-1346dda3a1c5ffee678d5495b2a8e90ba9e76c4ec45f74be4a9fe96d5aa5eacb WatchSource:0}: Error finding container 1346dda3a1c5ffee678d5495b2a8e90ba9e76c4ec45f74be4a9fe96d5aa5eacb: Status 404 returned error can't find the container with id 1346dda3a1c5ffee678d5495b2a8e90ba9e76c4ec45f74be4a9fe96d5aa5eacb Dec 01 20:21:59 crc kubenswrapper[4802]: I1201 20:21:59.506850 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.506826802 podStartE2EDuration="37.506826802s" podCreationTimestamp="2025-12-01 20:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:21:59.498791134 +0000 UTC m=+1541.061350785" watchObservedRunningTime="2025-12-01 20:21:59.506826802 +0000 UTC m=+1541.069386443" Dec 01 20:22:00 crc kubenswrapper[4802]: I1201 20:22:00.469359 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" event={"ID":"3d19da6a-24ea-49b2-9415-100c4db7bccd","Type":"ContainerStarted","Data":"7a4a1a0864be80e25e346507f6fb25cb3c4ad99abda57c0c8e9d94811953c300"} Dec 01 20:22:00 crc kubenswrapper[4802]: I1201 20:22:00.469960 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" event={"ID":"3d19da6a-24ea-49b2-9415-100c4db7bccd","Type":"ContainerStarted","Data":"1346dda3a1c5ffee678d5495b2a8e90ba9e76c4ec45f74be4a9fe96d5aa5eacb"} Dec 01 20:22:00 crc kubenswrapper[4802]: I1201 20:22:00.471934 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d078b34a-6a2a-4ea0-b7c8-c99ff6942170","Type":"ContainerStarted","Data":"5bb506e6e358797b7b184c988b28d6be8d6bda3b9d5c99ca140777e159c08010"} Dec 01 20:22:00 crc 
kubenswrapper[4802]: I1201 20:22:00.472279 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:22:00 crc kubenswrapper[4802]: I1201 20:22:00.492500 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" podStartSLOduration=1.786873263 podStartE2EDuration="2.492477193s" podCreationTimestamp="2025-12-01 20:21:58 +0000 UTC" firstStartedPulling="2025-12-01 20:21:59.467631318 +0000 UTC m=+1541.030190959" lastFinishedPulling="2025-12-01 20:22:00.173235258 +0000 UTC m=+1541.735794889" observedRunningTime="2025-12-01 20:22:00.484961061 +0000 UTC m=+1542.047520702" watchObservedRunningTime="2025-12-01 20:22:00.492477193 +0000 UTC m=+1542.055036844" Dec 01 20:22:00 crc kubenswrapper[4802]: I1201 20:22:00.534269 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.534251618 podStartE2EDuration="37.534251618s" podCreationTimestamp="2025-12-01 20:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:22:00.523282778 +0000 UTC m=+1542.085842429" watchObservedRunningTime="2025-12-01 20:22:00.534251618 +0000 UTC m=+1542.096811259" Dec 01 20:22:11 crc kubenswrapper[4802]: I1201 20:22:11.720225 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:22:11 crc kubenswrapper[4802]: E1201 20:22:11.720889 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" 
podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:22:13 crc kubenswrapper[4802]: I1201 20:22:13.390866 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 20:22:14 crc kubenswrapper[4802]: I1201 20:22:14.449533 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.082169 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ch975"] Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.085766 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.095427 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ch975"] Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.210734 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-utilities\") pod \"certified-operators-ch975\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.211097 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-catalog-content\") pod \"certified-operators-ch975\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.211294 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwxmz\" (UniqueName: 
\"kubernetes.io/projected/67b415a5-3b6d-4974-9812-66cfaa40455f-kube-api-access-mwxmz\") pod \"certified-operators-ch975\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.312646 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwxmz\" (UniqueName: \"kubernetes.io/projected/67b415a5-3b6d-4974-9812-66cfaa40455f-kube-api-access-mwxmz\") pod \"certified-operators-ch975\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.312759 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-utilities\") pod \"certified-operators-ch975\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.312792 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-catalog-content\") pod \"certified-operators-ch975\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.313341 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-utilities\") pod \"certified-operators-ch975\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.313446 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-catalog-content\") pod \"certified-operators-ch975\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.333432 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwxmz\" (UniqueName: \"kubernetes.io/projected/67b415a5-3b6d-4974-9812-66cfaa40455f-kube-api-access-mwxmz\") pod \"certified-operators-ch975\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.421819 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:20 crc kubenswrapper[4802]: I1201 20:22:20.931241 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ch975"] Dec 01 20:22:21 crc kubenswrapper[4802]: I1201 20:22:21.688642 4802 generic.go:334] "Generic (PLEG): container finished" podID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerID="aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029" exitCode=0 Dec 01 20:22:21 crc kubenswrapper[4802]: I1201 20:22:21.688891 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ch975" event={"ID":"67b415a5-3b6d-4974-9812-66cfaa40455f","Type":"ContainerDied","Data":"aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029"} Dec 01 20:22:21 crc kubenswrapper[4802]: I1201 20:22:21.688921 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ch975" event={"ID":"67b415a5-3b6d-4974-9812-66cfaa40455f","Type":"ContainerStarted","Data":"f176558b688a9451825318a16920ecf76b6356f5cce39974a8babe079fd2b784"} Dec 01 20:22:22 crc kubenswrapper[4802]: I1201 20:22:22.720163 4802 scope.go:117] "RemoveContainer" 
containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:22:22 crc kubenswrapper[4802]: E1201 20:22:22.720706 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:22:23 crc kubenswrapper[4802]: I1201 20:22:23.709560 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ch975" event={"ID":"67b415a5-3b6d-4974-9812-66cfaa40455f","Type":"ContainerStarted","Data":"131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf"} Dec 01 20:22:24 crc kubenswrapper[4802]: I1201 20:22:24.721610 4802 generic.go:334] "Generic (PLEG): container finished" podID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerID="131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf" exitCode=0 Dec 01 20:22:24 crc kubenswrapper[4802]: I1201 20:22:24.749811 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ch975" event={"ID":"67b415a5-3b6d-4974-9812-66cfaa40455f","Type":"ContainerDied","Data":"131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf"} Dec 01 20:22:25 crc kubenswrapper[4802]: I1201 20:22:25.733815 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ch975" event={"ID":"67b415a5-3b6d-4974-9812-66cfaa40455f","Type":"ContainerStarted","Data":"7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39"} Dec 01 20:22:25 crc kubenswrapper[4802]: I1201 20:22:25.759496 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ch975" 
podStartSLOduration=2.001735393 podStartE2EDuration="5.759472348s" podCreationTimestamp="2025-12-01 20:22:20 +0000 UTC" firstStartedPulling="2025-12-01 20:22:21.691105937 +0000 UTC m=+1563.253665578" lastFinishedPulling="2025-12-01 20:22:25.448842892 +0000 UTC m=+1567.011402533" observedRunningTime="2025-12-01 20:22:25.752420799 +0000 UTC m=+1567.314980440" watchObservedRunningTime="2025-12-01 20:22:25.759472348 +0000 UTC m=+1567.322031979" Dec 01 20:22:30 crc kubenswrapper[4802]: I1201 20:22:30.422563 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:30 crc kubenswrapper[4802]: I1201 20:22:30.423043 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:30 crc kubenswrapper[4802]: I1201 20:22:30.476490 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:30 crc kubenswrapper[4802]: I1201 20:22:30.858040 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:30 crc kubenswrapper[4802]: I1201 20:22:30.907828 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ch975"] Dec 01 20:22:32 crc kubenswrapper[4802]: I1201 20:22:32.805567 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ch975" podUID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerName="registry-server" containerID="cri-o://7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39" gracePeriod=2 Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.284226 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.341337 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwxmz\" (UniqueName: \"kubernetes.io/projected/67b415a5-3b6d-4974-9812-66cfaa40455f-kube-api-access-mwxmz\") pod \"67b415a5-3b6d-4974-9812-66cfaa40455f\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.341503 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-utilities\") pod \"67b415a5-3b6d-4974-9812-66cfaa40455f\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.341561 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-catalog-content\") pod \"67b415a5-3b6d-4974-9812-66cfaa40455f\" (UID: \"67b415a5-3b6d-4974-9812-66cfaa40455f\") " Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.342341 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-utilities" (OuterVolumeSpecName: "utilities") pod "67b415a5-3b6d-4974-9812-66cfaa40455f" (UID: "67b415a5-3b6d-4974-9812-66cfaa40455f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.347276 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b415a5-3b6d-4974-9812-66cfaa40455f-kube-api-access-mwxmz" (OuterVolumeSpecName: "kube-api-access-mwxmz") pod "67b415a5-3b6d-4974-9812-66cfaa40455f" (UID: "67b415a5-3b6d-4974-9812-66cfaa40455f"). InnerVolumeSpecName "kube-api-access-mwxmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.431014 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67b415a5-3b6d-4974-9812-66cfaa40455f" (UID: "67b415a5-3b6d-4974-9812-66cfaa40455f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.443729 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwxmz\" (UniqueName: \"kubernetes.io/projected/67b415a5-3b6d-4974-9812-66cfaa40455f-kube-api-access-mwxmz\") on node \"crc\" DevicePath \"\"" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.443765 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.443777 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67b415a5-3b6d-4974-9812-66cfaa40455f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.823789 4802 generic.go:334] "Generic (PLEG): container finished" podID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerID="7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39" exitCode=0 Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.823832 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ch975" event={"ID":"67b415a5-3b6d-4974-9812-66cfaa40455f","Type":"ContainerDied","Data":"7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39"} Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.823857 4802 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-ch975" event={"ID":"67b415a5-3b6d-4974-9812-66cfaa40455f","Type":"ContainerDied","Data":"f176558b688a9451825318a16920ecf76b6356f5cce39974a8babe079fd2b784"} Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.823876 4802 scope.go:117] "RemoveContainer" containerID="7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.823885 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ch975" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.864439 4802 scope.go:117] "RemoveContainer" containerID="131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.874081 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ch975"] Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.884657 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ch975"] Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.893281 4802 scope.go:117] "RemoveContainer" containerID="aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.950122 4802 scope.go:117] "RemoveContainer" containerID="7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39" Dec 01 20:22:33 crc kubenswrapper[4802]: E1201 20:22:33.951126 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39\": container with ID starting with 7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39 not found: ID does not exist" containerID="7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 
20:22:33.951170 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39"} err="failed to get container status \"7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39\": rpc error: code = NotFound desc = could not find container \"7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39\": container with ID starting with 7939d1562596956dda36452e97bd7c7dee2f8cf6f36be36e6988fdc7a100dc39 not found: ID does not exist" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.951361 4802 scope.go:117] "RemoveContainer" containerID="131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf" Dec 01 20:22:33 crc kubenswrapper[4802]: E1201 20:22:33.951845 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf\": container with ID starting with 131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf not found: ID does not exist" containerID="131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.951881 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf"} err="failed to get container status \"131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf\": rpc error: code = NotFound desc = could not find container \"131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf\": container with ID starting with 131437239ce5094de4d6bb39141e0cbfa9b86fa1bd1461d8d2db194d48d7fabf not found: ID does not exist" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.951901 4802 scope.go:117] "RemoveContainer" containerID="aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029" Dec 01 20:22:33 crc 
kubenswrapper[4802]: E1201 20:22:33.952402 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029\": container with ID starting with aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029 not found: ID does not exist" containerID="aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029" Dec 01 20:22:33 crc kubenswrapper[4802]: I1201 20:22:33.952431 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029"} err="failed to get container status \"aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029\": rpc error: code = NotFound desc = could not find container \"aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029\": container with ID starting with aca3d03cc788587b480d2dcf13d80669f9bd7c70b35376578cae350a0c2e9029 not found: ID does not exist" Dec 01 20:22:34 crc kubenswrapper[4802]: I1201 20:22:34.720990 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:22:34 crc kubenswrapper[4802]: E1201 20:22:34.721238 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:22:34 crc kubenswrapper[4802]: I1201 20:22:34.736010 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b415a5-3b6d-4974-9812-66cfaa40455f" path="/var/lib/kubelet/pods/67b415a5-3b6d-4974-9812-66cfaa40455f/volumes" Dec 01 20:22:47 crc 
kubenswrapper[4802]: I1201 20:22:47.525274 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mr28p"] Dec 01 20:22:47 crc kubenswrapper[4802]: E1201 20:22:47.526600 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerName="extract-content" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.526625 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerName="extract-content" Dec 01 20:22:47 crc kubenswrapper[4802]: E1201 20:22:47.526649 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerName="registry-server" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.526658 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerName="registry-server" Dec 01 20:22:47 crc kubenswrapper[4802]: E1201 20:22:47.526674 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerName="extract-utilities" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.526687 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerName="extract-utilities" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.527010 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b415a5-3b6d-4974-9812-66cfaa40455f" containerName="registry-server" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.530325 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.542085 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr28p"] Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.620740 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-utilities\") pod \"redhat-marketplace-mr28p\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.621181 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnmqs\" (UniqueName: \"kubernetes.io/projected/cf171699-24d6-4436-9ac2-1e4dc47a3e30-kube-api-access-pnmqs\") pod \"redhat-marketplace-mr28p\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.621402 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-catalog-content\") pod \"redhat-marketplace-mr28p\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.723753 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-utilities\") pod \"redhat-marketplace-mr28p\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.723838 4802 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-pnmqs\" (UniqueName: \"kubernetes.io/projected/cf171699-24d6-4436-9ac2-1e4dc47a3e30-kube-api-access-pnmqs\") pod \"redhat-marketplace-mr28p\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.723896 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-catalog-content\") pod \"redhat-marketplace-mr28p\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.724418 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-utilities\") pod \"redhat-marketplace-mr28p\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.724513 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-catalog-content\") pod \"redhat-marketplace-mr28p\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.745964 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnmqs\" (UniqueName: \"kubernetes.io/projected/cf171699-24d6-4436-9ac2-1e4dc47a3e30-kube-api-access-pnmqs\") pod \"redhat-marketplace-mr28p\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:47 crc kubenswrapper[4802]: I1201 20:22:47.871111 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:48 crc kubenswrapper[4802]: I1201 20:22:48.364645 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr28p"] Dec 01 20:22:48 crc kubenswrapper[4802]: I1201 20:22:48.990704 4802 generic.go:334] "Generic (PLEG): container finished" podID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerID="277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c" exitCode=0 Dec 01 20:22:48 crc kubenswrapper[4802]: I1201 20:22:48.990796 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr28p" event={"ID":"cf171699-24d6-4436-9ac2-1e4dc47a3e30","Type":"ContainerDied","Data":"277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c"} Dec 01 20:22:48 crc kubenswrapper[4802]: I1201 20:22:48.991228 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr28p" event={"ID":"cf171699-24d6-4436-9ac2-1e4dc47a3e30","Type":"ContainerStarted","Data":"45b368bd71908c5cd0b5cfeaa45c65f64c763bfbf94b820b3ea0723b688c3618"} Dec 01 20:22:49 crc kubenswrapper[4802]: I1201 20:22:49.720000 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:22:49 crc kubenswrapper[4802]: E1201 20:22:49.722121 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:22:50 crc kubenswrapper[4802]: I1201 20:22:50.017827 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr28p" 
event={"ID":"cf171699-24d6-4436-9ac2-1e4dc47a3e30","Type":"ContainerStarted","Data":"762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368"} Dec 01 20:22:51 crc kubenswrapper[4802]: I1201 20:22:51.027663 4802 generic.go:334] "Generic (PLEG): container finished" podID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerID="762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368" exitCode=0 Dec 01 20:22:51 crc kubenswrapper[4802]: I1201 20:22:51.027722 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr28p" event={"ID":"cf171699-24d6-4436-9ac2-1e4dc47a3e30","Type":"ContainerDied","Data":"762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368"} Dec 01 20:22:52 crc kubenswrapper[4802]: I1201 20:22:52.041866 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr28p" event={"ID":"cf171699-24d6-4436-9ac2-1e4dc47a3e30","Type":"ContainerStarted","Data":"d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51"} Dec 01 20:22:52 crc kubenswrapper[4802]: I1201 20:22:52.071489 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mr28p" podStartSLOduration=2.308305629 podStartE2EDuration="5.071464884s" podCreationTimestamp="2025-12-01 20:22:47 +0000 UTC" firstStartedPulling="2025-12-01 20:22:48.993924794 +0000 UTC m=+1590.556484435" lastFinishedPulling="2025-12-01 20:22:51.757084049 +0000 UTC m=+1593.319643690" observedRunningTime="2025-12-01 20:22:52.068707306 +0000 UTC m=+1593.631266957" watchObservedRunningTime="2025-12-01 20:22:52.071464884 +0000 UTC m=+1593.634024535" Dec 01 20:22:57 crc kubenswrapper[4802]: I1201 20:22:57.871564 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:57 crc kubenswrapper[4802]: I1201 20:22:57.872157 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:57 crc kubenswrapper[4802]: I1201 20:22:57.942409 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:58 crc kubenswrapper[4802]: I1201 20:22:58.202430 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:22:58 crc kubenswrapper[4802]: I1201 20:22:58.259532 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr28p"] Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.135604 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mr28p" podUID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerName="registry-server" containerID="cri-o://d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51" gracePeriod=2 Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.719893 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:23:00 crc kubenswrapper[4802]: E1201 20:23:00.720421 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.723846 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.857685 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-catalog-content\") pod \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.857760 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-utilities\") pod \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.857911 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnmqs\" (UniqueName: \"kubernetes.io/projected/cf171699-24d6-4436-9ac2-1e4dc47a3e30-kube-api-access-pnmqs\") pod \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\" (UID: \"cf171699-24d6-4436-9ac2-1e4dc47a3e30\") " Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.859029 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-utilities" (OuterVolumeSpecName: "utilities") pod "cf171699-24d6-4436-9ac2-1e4dc47a3e30" (UID: "cf171699-24d6-4436-9ac2-1e4dc47a3e30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.863765 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf171699-24d6-4436-9ac2-1e4dc47a3e30-kube-api-access-pnmqs" (OuterVolumeSpecName: "kube-api-access-pnmqs") pod "cf171699-24d6-4436-9ac2-1e4dc47a3e30" (UID: "cf171699-24d6-4436-9ac2-1e4dc47a3e30"). InnerVolumeSpecName "kube-api-access-pnmqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.891544 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf171699-24d6-4436-9ac2-1e4dc47a3e30" (UID: "cf171699-24d6-4436-9ac2-1e4dc47a3e30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.960560 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.960603 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf171699-24d6-4436-9ac2-1e4dc47a3e30-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:23:00 crc kubenswrapper[4802]: I1201 20:23:00.960618 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnmqs\" (UniqueName: \"kubernetes.io/projected/cf171699-24d6-4436-9ac2-1e4dc47a3e30-kube-api-access-pnmqs\") on node \"crc\" DevicePath \"\"" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.146439 4802 generic.go:334] "Generic (PLEG): container finished" podID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerID="d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51" exitCode=0 Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.146707 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr28p" event={"ID":"cf171699-24d6-4436-9ac2-1e4dc47a3e30","Type":"ContainerDied","Data":"d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51"} Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.146761 4802 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mr28p" event={"ID":"cf171699-24d6-4436-9ac2-1e4dc47a3e30","Type":"ContainerDied","Data":"45b368bd71908c5cd0b5cfeaa45c65f64c763bfbf94b820b3ea0723b688c3618"} Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.146779 4802 scope.go:117] "RemoveContainer" containerID="d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.146830 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr28p" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.174110 4802 scope.go:117] "RemoveContainer" containerID="762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.199332 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr28p"] Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.210687 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr28p"] Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.218946 4802 scope.go:117] "RemoveContainer" containerID="277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.266758 4802 scope.go:117] "RemoveContainer" containerID="d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51" Dec 01 20:23:01 crc kubenswrapper[4802]: E1201 20:23:01.267157 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51\": container with ID starting with d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51 not found: ID does not exist" containerID="d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.267289 4802 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51"} err="failed to get container status \"d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51\": rpc error: code = NotFound desc = could not find container \"d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51\": container with ID starting with d3c2bca82e0cf3bea5f297192001caa7fa847e0b62349078d495cbac62ed0e51 not found: ID does not exist" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.267373 4802 scope.go:117] "RemoveContainer" containerID="762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368" Dec 01 20:23:01 crc kubenswrapper[4802]: E1201 20:23:01.267702 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368\": container with ID starting with 762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368 not found: ID does not exist" containerID="762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.267732 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368"} err="failed to get container status \"762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368\": rpc error: code = NotFound desc = could not find container \"762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368\": container with ID starting with 762e580693ed707df89fea796fce42e4c501a1fdd04815ccfdf2c6a69ab0e368 not found: ID does not exist" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.267754 4802 scope.go:117] "RemoveContainer" containerID="277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c" Dec 01 20:23:01 crc kubenswrapper[4802]: E1201 
20:23:01.267943 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c\": container with ID starting with 277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c not found: ID does not exist" containerID="277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c" Dec 01 20:23:01 crc kubenswrapper[4802]: I1201 20:23:01.268029 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c"} err="failed to get container status \"277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c\": rpc error: code = NotFound desc = could not find container \"277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c\": container with ID starting with 277749cb4e5a45f9d5239feb56fb7c6a008a18f79fda2e8fcb5140655d5d1a4c not found: ID does not exist" Dec 01 20:23:02 crc kubenswrapper[4802]: I1201 20:23:02.732776 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" path="/var/lib/kubelet/pods/cf171699-24d6-4436-9ac2-1e4dc47a3e30/volumes" Dec 01 20:23:11 crc kubenswrapper[4802]: I1201 20:23:11.720108 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:23:11 crc kubenswrapper[4802]: E1201 20:23:11.720843 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:23:23 crc kubenswrapper[4802]: I1201 20:23:23.599626 
4802 scope.go:117] "RemoveContainer" containerID="229722e7a59e4b76f8fb2b1ae757a602669ff8478bd89105ba1637846c35b8c9" Dec 01 20:23:23 crc kubenswrapper[4802]: I1201 20:23:23.720866 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:23:23 crc kubenswrapper[4802]: E1201 20:23:23.721552 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:23:35 crc kubenswrapper[4802]: I1201 20:23:35.721180 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:23:35 crc kubenswrapper[4802]: E1201 20:23:35.722590 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:23:49 crc kubenswrapper[4802]: I1201 20:23:49.719892 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:23:49 crc kubenswrapper[4802]: E1201 20:23:49.721798 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:24:03 crc kubenswrapper[4802]: I1201 20:24:03.720523 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:24:03 crc kubenswrapper[4802]: E1201 20:24:03.721624 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:24:15 crc kubenswrapper[4802]: I1201 20:24:15.720175 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:24:15 crc kubenswrapper[4802]: E1201 20:24:15.720939 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:24:23 crc kubenswrapper[4802]: I1201 20:24:23.736604 4802 scope.go:117] "RemoveContainer" containerID="579401e1051ba58c5e9e4efd000ea6777646162f95955108bf93a6abab07e95b" Dec 01 20:24:28 crc kubenswrapper[4802]: I1201 20:24:28.730811 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:24:28 crc kubenswrapper[4802]: E1201 20:24:28.732237 4802 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:24:43 crc kubenswrapper[4802]: I1201 20:24:43.720018 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:24:43 crc kubenswrapper[4802]: E1201 20:24:43.721062 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:24:54 crc kubenswrapper[4802]: I1201 20:24:54.719850 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:24:54 crc kubenswrapper[4802]: E1201 20:24:54.720622 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:25:09 crc kubenswrapper[4802]: I1201 20:25:09.721057 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:25:09 crc kubenswrapper[4802]: E1201 
20:25:09.722286 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:25:21 crc kubenswrapper[4802]: I1201 20:25:21.721209 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:25:21 crc kubenswrapper[4802]: E1201 20:25:21.726312 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:25:33 crc kubenswrapper[4802]: I1201 20:25:33.720540 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:25:33 crc kubenswrapper[4802]: E1201 20:25:33.721158 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:25:44 crc kubenswrapper[4802]: I1201 20:25:44.722175 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:25:44 crc 
kubenswrapper[4802]: E1201 20:25:44.723079 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.335412 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g6tz2"] Dec 01 20:25:51 crc kubenswrapper[4802]: E1201 20:25:51.336543 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerName="registry-server" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.336563 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerName="registry-server" Dec 01 20:25:51 crc kubenswrapper[4802]: E1201 20:25:51.336588 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerName="extract-utilities" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.336596 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerName="extract-utilities" Dec 01 20:25:51 crc kubenswrapper[4802]: E1201 20:25:51.336612 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerName="extract-content" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.336619 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerName="extract-content" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.336839 4802 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cf171699-24d6-4436-9ac2-1e4dc47a3e30" containerName="registry-server" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.338464 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.346471 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6tz2"] Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.519101 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-utilities\") pod \"community-operators-g6tz2\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.519302 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls5tn\" (UniqueName: \"kubernetes.io/projected/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-kube-api-access-ls5tn\") pod \"community-operators-g6tz2\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.519357 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-catalog-content\") pod \"community-operators-g6tz2\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.620831 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-utilities\") pod \"community-operators-g6tz2\" (UID: 
\"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.620890 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls5tn\" (UniqueName: \"kubernetes.io/projected/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-kube-api-access-ls5tn\") pod \"community-operators-g6tz2\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.620914 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-catalog-content\") pod \"community-operators-g6tz2\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.621754 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-utilities\") pod \"community-operators-g6tz2\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.621776 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-catalog-content\") pod \"community-operators-g6tz2\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.644030 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls5tn\" (UniqueName: \"kubernetes.io/projected/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-kube-api-access-ls5tn\") pod \"community-operators-g6tz2\" (UID: 
\"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:51 crc kubenswrapper[4802]: I1201 20:25:51.669895 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:25:52 crc kubenswrapper[4802]: I1201 20:25:52.198882 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6tz2"] Dec 01 20:25:52 crc kubenswrapper[4802]: W1201 20:25:52.207435 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode15d8b16_0b44_47d6_af0b_cd08d064c9b0.slice/crio-de9d4b5df187f79143a06e1a77b55ca1d6ed036d38eba7711ff71160b04aba08 WatchSource:0}: Error finding container de9d4b5df187f79143a06e1a77b55ca1d6ed036d38eba7711ff71160b04aba08: Status 404 returned error can't find the container with id de9d4b5df187f79143a06e1a77b55ca1d6ed036d38eba7711ff71160b04aba08 Dec 01 20:25:52 crc kubenswrapper[4802]: I1201 20:25:52.928570 4802 generic.go:334] "Generic (PLEG): container finished" podID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerID="31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518" exitCode=0 Dec 01 20:25:52 crc kubenswrapper[4802]: I1201 20:25:52.928611 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6tz2" event={"ID":"e15d8b16-0b44-47d6-af0b-cd08d064c9b0","Type":"ContainerDied","Data":"31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518"} Dec 01 20:25:52 crc kubenswrapper[4802]: I1201 20:25:52.928806 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6tz2" event={"ID":"e15d8b16-0b44-47d6-af0b-cd08d064c9b0","Type":"ContainerStarted","Data":"de9d4b5df187f79143a06e1a77b55ca1d6ed036d38eba7711ff71160b04aba08"} Dec 01 20:25:52 crc kubenswrapper[4802]: I1201 20:25:52.931431 4802 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 20:25:54 crc kubenswrapper[4802]: I1201 20:25:54.949690 4802 generic.go:334] "Generic (PLEG): container finished" podID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerID="1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85" exitCode=0 Dec 01 20:25:54 crc kubenswrapper[4802]: I1201 20:25:54.949743 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6tz2" event={"ID":"e15d8b16-0b44-47d6-af0b-cd08d064c9b0","Type":"ContainerDied","Data":"1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85"} Dec 01 20:25:55 crc kubenswrapper[4802]: I1201 20:25:55.960066 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6tz2" event={"ID":"e15d8b16-0b44-47d6-af0b-cd08d064c9b0","Type":"ContainerStarted","Data":"d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902"} Dec 01 20:25:57 crc kubenswrapper[4802]: I1201 20:25:57.720914 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:25:57 crc kubenswrapper[4802]: E1201 20:25:57.722066 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:25:57 crc kubenswrapper[4802]: I1201 20:25:57.984824 4802 generic.go:334] "Generic (PLEG): container finished" podID="3d19da6a-24ea-49b2-9415-100c4db7bccd" containerID="7a4a1a0864be80e25e346507f6fb25cb3c4ad99abda57c0c8e9d94811953c300" exitCode=0 Dec 01 20:25:57 crc kubenswrapper[4802]: I1201 20:25:57.984931 4802 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" event={"ID":"3d19da6a-24ea-49b2-9415-100c4db7bccd","Type":"ContainerDied","Data":"7a4a1a0864be80e25e346507f6fb25cb3c4ad99abda57c0c8e9d94811953c300"} Dec 01 20:25:58 crc kubenswrapper[4802]: I1201 20:25:58.019438 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g6tz2" podStartSLOduration=4.553198326 podStartE2EDuration="7.019403401s" podCreationTimestamp="2025-12-01 20:25:51 +0000 UTC" firstStartedPulling="2025-12-01 20:25:52.931213433 +0000 UTC m=+1774.493773074" lastFinishedPulling="2025-12-01 20:25:55.397418478 +0000 UTC m=+1776.959978149" observedRunningTime="2025-12-01 20:25:55.981564121 +0000 UTC m=+1777.544123772" watchObservedRunningTime="2025-12-01 20:25:58.019403401 +0000 UTC m=+1779.581963082" Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.437282 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.565176 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-ssh-key\") pod \"3d19da6a-24ea-49b2-9415-100c4db7bccd\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.565523 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsjvb\" (UniqueName: \"kubernetes.io/projected/3d19da6a-24ea-49b2-9415-100c4db7bccd-kube-api-access-rsjvb\") pod \"3d19da6a-24ea-49b2-9415-100c4db7bccd\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.565565 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-inventory\") pod \"3d19da6a-24ea-49b2-9415-100c4db7bccd\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.565658 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-bootstrap-combined-ca-bundle\") pod \"3d19da6a-24ea-49b2-9415-100c4db7bccd\" (UID: \"3d19da6a-24ea-49b2-9415-100c4db7bccd\") " Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.571713 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3d19da6a-24ea-49b2-9415-100c4db7bccd" (UID: "3d19da6a-24ea-49b2-9415-100c4db7bccd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.572897 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d19da6a-24ea-49b2-9415-100c4db7bccd-kube-api-access-rsjvb" (OuterVolumeSpecName: "kube-api-access-rsjvb") pod "3d19da6a-24ea-49b2-9415-100c4db7bccd" (UID: "3d19da6a-24ea-49b2-9415-100c4db7bccd"). InnerVolumeSpecName "kube-api-access-rsjvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.594217 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3d19da6a-24ea-49b2-9415-100c4db7bccd" (UID: "3d19da6a-24ea-49b2-9415-100c4db7bccd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.595466 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-inventory" (OuterVolumeSpecName: "inventory") pod "3d19da6a-24ea-49b2-9415-100c4db7bccd" (UID: "3d19da6a-24ea-49b2-9415-100c4db7bccd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.668160 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsjvb\" (UniqueName: \"kubernetes.io/projected/3d19da6a-24ea-49b2-9415-100c4db7bccd-kube-api-access-rsjvb\") on node \"crc\" DevicePath \"\"" Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.668187 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.668332 4802 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:25:59 crc kubenswrapper[4802]: I1201 20:25:59.668357 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d19da6a-24ea-49b2-9415-100c4db7bccd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.007911 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" event={"ID":"3d19da6a-24ea-49b2-9415-100c4db7bccd","Type":"ContainerDied","Data":"1346dda3a1c5ffee678d5495b2a8e90ba9e76c4ec45f74be4a9fe96d5aa5eacb"} Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.007956 4802 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="1346dda3a1c5ffee678d5495b2a8e90ba9e76c4ec45f74be4a9fe96d5aa5eacb" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.007932 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.121703 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf"] Dec 01 20:26:00 crc kubenswrapper[4802]: E1201 20:26:00.122128 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d19da6a-24ea-49b2-9415-100c4db7bccd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.122148 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d19da6a-24ea-49b2-9415-100c4db7bccd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.122368 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d19da6a-24ea-49b2-9415-100c4db7bccd" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.123069 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.125451 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.125729 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.125858 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.131469 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.133706 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf"] Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.278624 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.278707 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzms\" (UniqueName: \"kubernetes.io/projected/10691218-713c-4982-979e-cb0dde9077ed-kube-api-access-ktzms\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 
20:26:00.278903 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.381344 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.381500 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzms\" (UniqueName: \"kubernetes.io/projected/10691218-713c-4982-979e-cb0dde9077ed-kube-api-access-ktzms\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.381660 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.388168 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.388320 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.402619 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzms\" (UniqueName: \"kubernetes.io/projected/10691218-713c-4982-979e-cb0dde9077ed-kube-api-access-ktzms\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:00 crc kubenswrapper[4802]: I1201 20:26:00.452905 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:26:01 crc kubenswrapper[4802]: I1201 20:26:01.061778 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf"] Dec 01 20:26:01 crc kubenswrapper[4802]: I1201 20:26:01.670469 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:26:01 crc kubenswrapper[4802]: I1201 20:26:01.670787 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:26:01 crc kubenswrapper[4802]: I1201 20:26:01.751775 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:26:02 crc kubenswrapper[4802]: I1201 20:26:02.024121 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" event={"ID":"10691218-713c-4982-979e-cb0dde9077ed","Type":"ContainerStarted","Data":"76d71391080ac6c1aaac302c26c6522a934e42632bf335cadc6272070bb6ef3e"} Dec 01 20:26:02 crc kubenswrapper[4802]: I1201 20:26:02.024166 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" event={"ID":"10691218-713c-4982-979e-cb0dde9077ed","Type":"ContainerStarted","Data":"9587b0d74f29e926979b0ede56bdb06042e0d20a5f8a1e30a18cb14180fa87dd"} Dec 01 20:26:02 crc kubenswrapper[4802]: I1201 20:26:02.052022 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" podStartSLOduration=1.625865177 podStartE2EDuration="2.051996154s" podCreationTimestamp="2025-12-01 20:26:00 +0000 UTC" firstStartedPulling="2025-12-01 20:26:01.079869026 +0000 UTC m=+1782.642428667" 
lastFinishedPulling="2025-12-01 20:26:01.505999993 +0000 UTC m=+1783.068559644" observedRunningTime="2025-12-01 20:26:02.038659938 +0000 UTC m=+1783.601219589" watchObservedRunningTime="2025-12-01 20:26:02.051996154 +0000 UTC m=+1783.614555795" Dec 01 20:26:02 crc kubenswrapper[4802]: I1201 20:26:02.082449 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:26:03 crc kubenswrapper[4802]: I1201 20:26:03.126159 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6tz2"] Dec 01 20:26:04 crc kubenswrapper[4802]: I1201 20:26:04.042017 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g6tz2" podUID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerName="registry-server" containerID="cri-o://d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902" gracePeriod=2 Dec 01 20:26:04 crc kubenswrapper[4802]: I1201 20:26:04.508725 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:26:04 crc kubenswrapper[4802]: I1201 20:26:04.684111 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls5tn\" (UniqueName: \"kubernetes.io/projected/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-kube-api-access-ls5tn\") pod \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " Dec 01 20:26:04 crc kubenswrapper[4802]: I1201 20:26:04.684257 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-catalog-content\") pod \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " Dec 01 20:26:04 crc kubenswrapper[4802]: I1201 20:26:04.684393 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-utilities\") pod \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\" (UID: \"e15d8b16-0b44-47d6-af0b-cd08d064c9b0\") " Dec 01 20:26:04 crc kubenswrapper[4802]: I1201 20:26:04.693759 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-utilities" (OuterVolumeSpecName: "utilities") pod "e15d8b16-0b44-47d6-af0b-cd08d064c9b0" (UID: "e15d8b16-0b44-47d6-af0b-cd08d064c9b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:26:04 crc kubenswrapper[4802]: I1201 20:26:04.698011 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-kube-api-access-ls5tn" (OuterVolumeSpecName: "kube-api-access-ls5tn") pod "e15d8b16-0b44-47d6-af0b-cd08d064c9b0" (UID: "e15d8b16-0b44-47d6-af0b-cd08d064c9b0"). InnerVolumeSpecName "kube-api-access-ls5tn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:26:04 crc kubenswrapper[4802]: I1201 20:26:04.787865 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:26:04 crc kubenswrapper[4802]: I1201 20:26:04.787896 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls5tn\" (UniqueName: \"kubernetes.io/projected/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-kube-api-access-ls5tn\") on node \"crc\" DevicePath \"\"" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.051272 4802 generic.go:334] "Generic (PLEG): container finished" podID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerID="d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902" exitCode=0 Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.051345 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6tz2" event={"ID":"e15d8b16-0b44-47d6-af0b-cd08d064c9b0","Type":"ContainerDied","Data":"d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902"} Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.051365 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g6tz2" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.051716 4802 scope.go:117] "RemoveContainer" containerID="d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.051670 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6tz2" event={"ID":"e15d8b16-0b44-47d6-af0b-cd08d064c9b0","Type":"ContainerDied","Data":"de9d4b5df187f79143a06e1a77b55ca1d6ed036d38eba7711ff71160b04aba08"} Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.070329 4802 scope.go:117] "RemoveContainer" containerID="1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.102138 4802 scope.go:117] "RemoveContainer" containerID="31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.149768 4802 scope.go:117] "RemoveContainer" containerID="d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902" Dec 01 20:26:05 crc kubenswrapper[4802]: E1201 20:26:05.150391 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902\": container with ID starting with d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902 not found: ID does not exist" containerID="d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.150466 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902"} err="failed to get container status \"d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902\": rpc error: code = NotFound desc = could not find container 
\"d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902\": container with ID starting with d8e4f3585055411c75bc165b47a3f17df6b248bf19da095719d8be66a82c1902 not found: ID does not exist" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.150522 4802 scope.go:117] "RemoveContainer" containerID="1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85" Dec 01 20:26:05 crc kubenswrapper[4802]: E1201 20:26:05.151282 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85\": container with ID starting with 1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85 not found: ID does not exist" containerID="1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.151342 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85"} err="failed to get container status \"1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85\": rpc error: code = NotFound desc = could not find container \"1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85\": container with ID starting with 1bb874c5ab3bccac8c6a3c407391a8cc7b54336157366086079da936b9847f85 not found: ID does not exist" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.151373 4802 scope.go:117] "RemoveContainer" containerID="31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518" Dec 01 20:26:05 crc kubenswrapper[4802]: E1201 20:26:05.151767 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518\": container with ID starting with 31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518 not found: ID does not exist" 
containerID="31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.151797 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518"} err="failed to get container status \"31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518\": rpc error: code = NotFound desc = could not find container \"31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518\": container with ID starting with 31ce8a3832282438b65c7b3950f5ee83fc87efea2304d1234d98e49a432a2518 not found: ID does not exist" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.190529 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e15d8b16-0b44-47d6-af0b-cd08d064c9b0" (UID: "e15d8b16-0b44-47d6-af0b-cd08d064c9b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.195392 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e15d8b16-0b44-47d6-af0b-cd08d064c9b0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.395180 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6tz2"] Dec 01 20:26:05 crc kubenswrapper[4802]: I1201 20:26:05.406706 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g6tz2"] Dec 01 20:26:06 crc kubenswrapper[4802]: I1201 20:26:06.729434 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" path="/var/lib/kubelet/pods/e15d8b16-0b44-47d6-af0b-cd08d064c9b0/volumes" Dec 01 20:26:09 crc kubenswrapper[4802]: I1201 20:26:09.719934 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:26:09 crc kubenswrapper[4802]: E1201 20:26:09.720591 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:26:23 crc kubenswrapper[4802]: I1201 20:26:23.870350 4802 scope.go:117] "RemoveContainer" containerID="9f5ed18f2689f6b2c76edb1ea0f439d1d49747eba9c2c8b2a28a2fdebc608276" Dec 01 20:26:23 crc kubenswrapper[4802]: I1201 20:26:23.903294 4802 scope.go:117] "RemoveContainer" containerID="fbf442ffa23c05d13159af090a968d0f87022c799004fc540034f4be42509714" Dec 01 20:26:24 crc kubenswrapper[4802]: I1201 
20:26:24.720342 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:26:24 crc kubenswrapper[4802]: E1201 20:26:24.720594 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:26:37 crc kubenswrapper[4802]: I1201 20:26:37.720068 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:26:37 crc kubenswrapper[4802]: E1201 20:26:37.722084 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:26:39 crc kubenswrapper[4802]: I1201 20:26:39.034445 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wbzkj"] Dec 01 20:26:39 crc kubenswrapper[4802]: I1201 20:26:39.043856 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wbzkj"] Dec 01 20:26:40 crc kubenswrapper[4802]: I1201 20:26:40.024567 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db78-account-create-update-t5b96"] Dec 01 20:26:40 crc kubenswrapper[4802]: I1201 20:26:40.036190 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rxltz"] Dec 01 20:26:40 crc kubenswrapper[4802]: 
I1201 20:26:40.045965 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5e7f-account-create-update-dt8m9"] Dec 01 20:26:40 crc kubenswrapper[4802]: I1201 20:26:40.053968 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db78-account-create-update-t5b96"] Dec 01 20:26:40 crc kubenswrapper[4802]: I1201 20:26:40.075188 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rxltz"] Dec 01 20:26:40 crc kubenswrapper[4802]: I1201 20:26:40.085331 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5e7f-account-create-update-dt8m9"] Dec 01 20:26:40 crc kubenswrapper[4802]: I1201 20:26:40.730480 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5842d665-da28-4918-abcc-106446b09206" path="/var/lib/kubelet/pods/5842d665-da28-4918-abcc-106446b09206/volumes" Dec 01 20:26:40 crc kubenswrapper[4802]: I1201 20:26:40.731137 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5918a7fb-54b8-45ff-9f5d-0c86f93553fe" path="/var/lib/kubelet/pods/5918a7fb-54b8-45ff-9f5d-0c86f93553fe/volumes" Dec 01 20:26:40 crc kubenswrapper[4802]: I1201 20:26:40.731670 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703d4a88-3292-40bc-871c-6e87449826d0" path="/var/lib/kubelet/pods/703d4a88-3292-40bc-871c-6e87449826d0/volumes" Dec 01 20:26:40 crc kubenswrapper[4802]: I1201 20:26:40.732229 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aebb6147-155e-4021-88e7-19d2f1c2ffff" path="/var/lib/kubelet/pods/aebb6147-155e-4021-88e7-19d2f1c2ffff/volumes" Dec 01 20:26:46 crc kubenswrapper[4802]: I1201 20:26:46.038689 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e376-account-create-update-rz2t8"] Dec 01 20:26:46 crc kubenswrapper[4802]: I1201 20:26:46.053631 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7mqtg"] Dec 01 
20:26:46 crc kubenswrapper[4802]: I1201 20:26:46.062258 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e376-account-create-update-rz2t8"] Dec 01 20:26:46 crc kubenswrapper[4802]: I1201 20:26:46.069222 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7mqtg"] Dec 01 20:26:46 crc kubenswrapper[4802]: I1201 20:26:46.741034 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c488df2-71d2-4e82-9ac1-224ad92f0744" path="/var/lib/kubelet/pods/1c488df2-71d2-4e82-9ac1-224ad92f0744/volumes" Dec 01 20:26:46 crc kubenswrapper[4802]: I1201 20:26:46.742830 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c415ac94-c3a7-4e60-953f-748556482cc6" path="/var/lib/kubelet/pods/c415ac94-c3a7-4e60-953f-748556482cc6/volumes" Dec 01 20:26:51 crc kubenswrapper[4802]: I1201 20:26:51.719933 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:26:51 crc kubenswrapper[4802]: E1201 20:26:51.720739 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:27:06 crc kubenswrapper[4802]: I1201 20:27:06.721034 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619" Dec 01 20:27:07 crc kubenswrapper[4802]: I1201 20:27:07.683689 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" 
event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"26cf5483ade0dbc95267728676ce0ccc960449233741d088558ac81f81de5fa8"} Dec 01 20:27:14 crc kubenswrapper[4802]: I1201 20:27:14.763520 4802 generic.go:334] "Generic (PLEG): container finished" podID="10691218-713c-4982-979e-cb0dde9077ed" containerID="76d71391080ac6c1aaac302c26c6522a934e42632bf335cadc6272070bb6ef3e" exitCode=0 Dec 01 20:27:14 crc kubenswrapper[4802]: I1201 20:27:14.764245 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" event={"ID":"10691218-713c-4982-979e-cb0dde9077ed","Type":"ContainerDied","Data":"76d71391080ac6c1aaac302c26c6522a934e42632bf335cadc6272070bb6ef3e"} Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.186615 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.292435 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-inventory\") pod \"10691218-713c-4982-979e-cb0dde9077ed\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.292506 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktzms\" (UniqueName: \"kubernetes.io/projected/10691218-713c-4982-979e-cb0dde9077ed-kube-api-access-ktzms\") pod \"10691218-713c-4982-979e-cb0dde9077ed\" (UID: \"10691218-713c-4982-979e-cb0dde9077ed\") " Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.292531 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-ssh-key\") pod \"10691218-713c-4982-979e-cb0dde9077ed\" (UID: 
\"10691218-713c-4982-979e-cb0dde9077ed\") " Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.297617 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10691218-713c-4982-979e-cb0dde9077ed-kube-api-access-ktzms" (OuterVolumeSpecName: "kube-api-access-ktzms") pod "10691218-713c-4982-979e-cb0dde9077ed" (UID: "10691218-713c-4982-979e-cb0dde9077ed"). InnerVolumeSpecName "kube-api-access-ktzms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.317473 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "10691218-713c-4982-979e-cb0dde9077ed" (UID: "10691218-713c-4982-979e-cb0dde9077ed"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.324513 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-inventory" (OuterVolumeSpecName: "inventory") pod "10691218-713c-4982-979e-cb0dde9077ed" (UID: "10691218-713c-4982-979e-cb0dde9077ed"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.394257 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.394291 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktzms\" (UniqueName: \"kubernetes.io/projected/10691218-713c-4982-979e-cb0dde9077ed-kube-api-access-ktzms\") on node \"crc\" DevicePath \"\"" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.394303 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10691218-713c-4982-979e-cb0dde9077ed-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.786223 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.786284 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf" event={"ID":"10691218-713c-4982-979e-cb0dde9077ed","Type":"ContainerDied","Data":"9587b0d74f29e926979b0ede56bdb06042e0d20a5f8a1e30a18cb14180fa87dd"} Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.786684 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9587b0d74f29e926979b0ede56bdb06042e0d20a5f8a1e30a18cb14180fa87dd" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.899614 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67"] Dec 01 20:27:16 crc kubenswrapper[4802]: E1201 20:27:16.900079 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" 
containerName="extract-content" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.900100 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerName="extract-content" Dec 01 20:27:16 crc kubenswrapper[4802]: E1201 20:27:16.900136 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerName="registry-server" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.900145 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerName="registry-server" Dec 01 20:27:16 crc kubenswrapper[4802]: E1201 20:27:16.900163 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10691218-713c-4982-979e-cb0dde9077ed" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.900172 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="10691218-713c-4982-979e-cb0dde9077ed" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:27:16 crc kubenswrapper[4802]: E1201 20:27:16.900210 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerName="extract-utilities" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.900218 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerName="extract-utilities" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.900460 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15d8b16-0b44-47d6-af0b-cd08d064c9b0" containerName="registry-server" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.900491 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="10691218-713c-4982-979e-cb0dde9077ed" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 
20:27:16.901242 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.903758 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.903966 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.905004 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.905666 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:27:16 crc kubenswrapper[4802]: I1201 20:27:16.916608 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67"] Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.006135 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm5mq\" (UniqueName: \"kubernetes.io/projected/b86803dd-2f31-412c-bb40-44235c925362-kube-api-access-pm5mq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gxg67\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.006505 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gxg67\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.006655 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gxg67\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.108484 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm5mq\" (UniqueName: \"kubernetes.io/projected/b86803dd-2f31-412c-bb40-44235c925362-kube-api-access-pm5mq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gxg67\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.108595 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gxg67\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.108665 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gxg67\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.115677 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gxg67\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.119303 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gxg67\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.137682 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm5mq\" (UniqueName: \"kubernetes.io/projected/b86803dd-2f31-412c-bb40-44235c925362-kube-api-access-pm5mq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gxg67\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.228819 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.734935 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67"] Dec 01 20:27:17 crc kubenswrapper[4802]: W1201 20:27:17.748175 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb86803dd_2f31_412c_bb40_44235c925362.slice/crio-245574a0667e3ac06c2491fe4aa483e5e7d2571f9eb9d9a19c6282f344300ed3 WatchSource:0}: Error finding container 245574a0667e3ac06c2491fe4aa483e5e7d2571f9eb9d9a19c6282f344300ed3: Status 404 returned error can't find the container with id 245574a0667e3ac06c2491fe4aa483e5e7d2571f9eb9d9a19c6282f344300ed3 Dec 01 20:27:17 crc kubenswrapper[4802]: I1201 20:27:17.798742 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" event={"ID":"b86803dd-2f31-412c-bb40-44235c925362","Type":"ContainerStarted","Data":"245574a0667e3ac06c2491fe4aa483e5e7d2571f9eb9d9a19c6282f344300ed3"} Dec 01 20:27:18 crc kubenswrapper[4802]: I1201 20:27:18.042866 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-mhtn2"] Dec 01 20:27:18 crc kubenswrapper[4802]: I1201 20:27:18.050982 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-mhtn2"] Dec 01 20:27:18 crc kubenswrapper[4802]: I1201 20:27:18.736292 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0448f2b2-4e07-4fa4-85ea-0c37f6664dcd" path="/var/lib/kubelet/pods/0448f2b2-4e07-4fa4-85ea-0c37f6664dcd/volumes" Dec 01 20:27:19 crc kubenswrapper[4802]: I1201 20:27:19.036159 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-26c0-account-create-update-pmhgj"] Dec 01 20:27:19 crc kubenswrapper[4802]: I1201 20:27:19.044975 4802 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hcph5"] Dec 01 20:27:19 crc kubenswrapper[4802]: I1201 20:27:19.052098 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-5c267"] Dec 01 20:27:19 crc kubenswrapper[4802]: I1201 20:27:19.059476 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-26c0-account-create-update-pmhgj"] Dec 01 20:27:19 crc kubenswrapper[4802]: I1201 20:27:19.065676 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-5c267"] Dec 01 20:27:19 crc kubenswrapper[4802]: I1201 20:27:19.071911 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hcph5"] Dec 01 20:27:19 crc kubenswrapper[4802]: I1201 20:27:19.815996 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" event={"ID":"b86803dd-2f31-412c-bb40-44235c925362","Type":"ContainerStarted","Data":"0012d1bb733d38e8d278c8063965362705e5efff7b52466a5e7bd185ef65b802"} Dec 01 20:27:19 crc kubenswrapper[4802]: I1201 20:27:19.830786 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" podStartSLOduration=3.023766466 podStartE2EDuration="3.830763824s" podCreationTimestamp="2025-12-01 20:27:16 +0000 UTC" firstStartedPulling="2025-12-01 20:27:17.751947893 +0000 UTC m=+1859.314507534" lastFinishedPulling="2025-12-01 20:27:18.558945251 +0000 UTC m=+1860.121504892" observedRunningTime="2025-12-01 20:27:19.829413331 +0000 UTC m=+1861.391972982" watchObservedRunningTime="2025-12-01 20:27:19.830763824 +0000 UTC m=+1861.393323475" Dec 01 20:27:20 crc kubenswrapper[4802]: I1201 20:27:20.734371 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5c1ebd-d03b-4914-a8fd-498a3cd06581" path="/var/lib/kubelet/pods/0b5c1ebd-d03b-4914-a8fd-498a3cd06581/volumes" Dec 
01 20:27:20 crc kubenswrapper[4802]: I1201 20:27:20.736976 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c7ca3c-3346-442b-b9c8-54b8beb87e8e" path="/var/lib/kubelet/pods/30c7ca3c-3346-442b-b9c8-54b8beb87e8e/volumes" Dec 01 20:27:20 crc kubenswrapper[4802]: I1201 20:27:20.738458 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80050873-f4d6-4468-ba9c-7f5090932b88" path="/var/lib/kubelet/pods/80050873-f4d6-4468-ba9c-7f5090932b88/volumes" Dec 01 20:27:22 crc kubenswrapper[4802]: I1201 20:27:22.044549 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-371f-account-create-update-8gtw9"] Dec 01 20:27:22 crc kubenswrapper[4802]: I1201 20:27:22.058979 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rsnxk"] Dec 01 20:27:22 crc kubenswrapper[4802]: I1201 20:27:22.074586 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4560-account-create-update-256g2"] Dec 01 20:27:22 crc kubenswrapper[4802]: I1201 20:27:22.085654 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rsnxk"] Dec 01 20:27:22 crc kubenswrapper[4802]: I1201 20:27:22.095925 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-371f-account-create-update-8gtw9"] Dec 01 20:27:22 crc kubenswrapper[4802]: I1201 20:27:22.104899 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4560-account-create-update-256g2"] Dec 01 20:27:22 crc kubenswrapper[4802]: I1201 20:27:22.731525 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7b6d98-cecf-4243-b6b9-b217eb229549" path="/var/lib/kubelet/pods/3f7b6d98-cecf-4243-b6b9-b217eb229549/volumes" Dec 01 20:27:22 crc kubenswrapper[4802]: I1201 20:27:22.732411 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4551c7b2-d51a-45d2-a645-58bd0da669c5" 
path="/var/lib/kubelet/pods/4551c7b2-d51a-45d2-a645-58bd0da669c5/volumes" Dec 01 20:27:22 crc kubenswrapper[4802]: I1201 20:27:22.732960 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a" path="/var/lib/kubelet/pods/7cff0cc2-bdb2-4b4d-8b91-51a4021dcc2a/volumes" Dec 01 20:27:23 crc kubenswrapper[4802]: I1201 20:27:23.855918 4802 generic.go:334] "Generic (PLEG): container finished" podID="b86803dd-2f31-412c-bb40-44235c925362" containerID="0012d1bb733d38e8d278c8063965362705e5efff7b52466a5e7bd185ef65b802" exitCode=0 Dec 01 20:27:23 crc kubenswrapper[4802]: I1201 20:27:23.855963 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" event={"ID":"b86803dd-2f31-412c-bb40-44235c925362","Type":"ContainerDied","Data":"0012d1bb733d38e8d278c8063965362705e5efff7b52466a5e7bd185ef65b802"} Dec 01 20:27:23 crc kubenswrapper[4802]: I1201 20:27:23.994873 4802 scope.go:117] "RemoveContainer" containerID="6340a4eb621303ea2058895ebb55c94ffff1788ceb040708bd567bc75f12f3fb" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.016468 4802 scope.go:117] "RemoveContainer" containerID="a46b6644d49134838ae8f1b2058232568f9524ff723e1943b9f2222077639a79" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.064625 4802 scope.go:117] "RemoveContainer" containerID="6072232d7c220a592ce53e9e8abe53cc3a9c007c8c9d7210e98e1559e95f746b" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.099385 4802 scope.go:117] "RemoveContainer" containerID="199ca1f92b4971d21f75865bd3b37a30f91dd77268f1281d6980f1e1f07e2410" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.132629 4802 scope.go:117] "RemoveContainer" containerID="55cf11a93b945ea13e7ce60da4d4057335a0570510ca7db223ef02378401fd9d" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.169336 4802 scope.go:117] "RemoveContainer" containerID="e8cbb1c3d378658de37cdab71495281c580ad159bbdc2ad6b4b0be535009c990" Dec 
01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.196590 4802 scope.go:117] "RemoveContainer" containerID="72a32f2baec80aaaadb56bd8ad611201a57f152a27790c639c20bf86d685e53e" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.215279 4802 scope.go:117] "RemoveContainer" containerID="d6353307613dfc077353cf064cdc66614f634042a3d0013764a4524c8f6257f6" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.231680 4802 scope.go:117] "RemoveContainer" containerID="c6ec4adbd9890cafa668a95709aa21fd99a23e7486e2fdf2b6dd85856a007083" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.252721 4802 scope.go:117] "RemoveContainer" containerID="443cb30caf2603f77867072fbf034809cacd5027ad575f34ed8d81c3d32b05d7" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.275389 4802 scope.go:117] "RemoveContainer" containerID="78750f258f78bf0823a33af908db5c118b068bf4427190a3f0d55001944ebb63" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.295034 4802 scope.go:117] "RemoveContainer" containerID="9dcce808985a3fd15a2baeaa92a62b9648efe6baecafa319ea7e7138de935a7c" Dec 01 20:27:24 crc kubenswrapper[4802]: I1201 20:27:24.315731 4802 scope.go:117] "RemoveContainer" containerID="5a196846fc76864e8912d6493edb4c1b3c7899dd992ec1455f2b53c61650588e" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.263491 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.369932 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-ssh-key\") pod \"b86803dd-2f31-412c-bb40-44235c925362\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.370034 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-inventory\") pod \"b86803dd-2f31-412c-bb40-44235c925362\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.370086 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm5mq\" (UniqueName: \"kubernetes.io/projected/b86803dd-2f31-412c-bb40-44235c925362-kube-api-access-pm5mq\") pod \"b86803dd-2f31-412c-bb40-44235c925362\" (UID: \"b86803dd-2f31-412c-bb40-44235c925362\") " Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.375246 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b86803dd-2f31-412c-bb40-44235c925362-kube-api-access-pm5mq" (OuterVolumeSpecName: "kube-api-access-pm5mq") pod "b86803dd-2f31-412c-bb40-44235c925362" (UID: "b86803dd-2f31-412c-bb40-44235c925362"). InnerVolumeSpecName "kube-api-access-pm5mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.398084 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b86803dd-2f31-412c-bb40-44235c925362" (UID: "b86803dd-2f31-412c-bb40-44235c925362"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.400093 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-inventory" (OuterVolumeSpecName: "inventory") pod "b86803dd-2f31-412c-bb40-44235c925362" (UID: "b86803dd-2f31-412c-bb40-44235c925362"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.472429 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.472580 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm5mq\" (UniqueName: \"kubernetes.io/projected/b86803dd-2f31-412c-bb40-44235c925362-kube-api-access-pm5mq\") on node \"crc\" DevicePath \"\"" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.472643 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b86803dd-2f31-412c-bb40-44235c925362-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.874459 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" event={"ID":"b86803dd-2f31-412c-bb40-44235c925362","Type":"ContainerDied","Data":"245574a0667e3ac06c2491fe4aa483e5e7d2571f9eb9d9a19c6282f344300ed3"} Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.874764 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="245574a0667e3ac06c2491fe4aa483e5e7d2571f9eb9d9a19c6282f344300ed3" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.874541 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.958432 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg"] Dec 01 20:27:25 crc kubenswrapper[4802]: E1201 20:27:25.958886 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86803dd-2f31-412c-bb40-44235c925362" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.958907 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86803dd-2f31-412c-bb40-44235c925362" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.959097 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86803dd-2f31-412c-bb40-44235c925362" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.959835 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.962518 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.962626 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.962683 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.966008 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:27:25 crc kubenswrapper[4802]: I1201 20:27:25.968882 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg"] Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.085422 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jfldg\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.085532 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n25p\" (UniqueName: \"kubernetes.io/projected/4e509bad-279a-4da8-9f77-06123708e579-kube-api-access-4n25p\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jfldg\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.085833 4802 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jfldg\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.188376 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jfldg\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.188452 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n25p\" (UniqueName: \"kubernetes.io/projected/4e509bad-279a-4da8-9f77-06123708e579-kube-api-access-4n25p\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jfldg\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.188553 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jfldg\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.192878 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jfldg\" (UID: 
\"4e509bad-279a-4da8-9f77-06123708e579\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.193420 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jfldg\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.207038 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n25p\" (UniqueName: \"kubernetes.io/projected/4e509bad-279a-4da8-9f77-06123708e579-kube-api-access-4n25p\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jfldg\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.284693 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.814350 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg"] Dec 01 20:27:26 crc kubenswrapper[4802]: I1201 20:27:26.882842 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" event={"ID":"4e509bad-279a-4da8-9f77-06123708e579","Type":"ContainerStarted","Data":"9c32d6faffb0de54956d98d8c601a74ff6c259a2ab3fca3a470bbfc23901df26"} Dec 01 20:27:27 crc kubenswrapper[4802]: I1201 20:27:27.026442 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-v6674"] Dec 01 20:27:27 crc kubenswrapper[4802]: I1201 20:27:27.037146 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-v6674"] Dec 01 20:27:27 crc kubenswrapper[4802]: I1201 20:27:27.891100 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" event={"ID":"4e509bad-279a-4da8-9f77-06123708e579","Type":"ContainerStarted","Data":"1fcb25fc29970b4c0d3998d519cf89e5f05646e4403fe83c821708fab03bfb6b"} Dec 01 20:27:27 crc kubenswrapper[4802]: I1201 20:27:27.910227 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" podStartSLOduration=2.067709999 podStartE2EDuration="2.910188315s" podCreationTimestamp="2025-12-01 20:27:25 +0000 UTC" firstStartedPulling="2025-12-01 20:27:26.819795772 +0000 UTC m=+1868.382355413" lastFinishedPulling="2025-12-01 20:27:27.662274088 +0000 UTC m=+1869.224833729" observedRunningTime="2025-12-01 20:27:27.908388828 +0000 UTC m=+1869.470948489" watchObservedRunningTime="2025-12-01 20:27:27.910188315 +0000 UTC m=+1869.472747956" Dec 01 20:27:28 crc kubenswrapper[4802]: I1201 20:27:28.731093 4802 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8852056c-2330-4f2b-acd3-89442f05e8c9" path="/var/lib/kubelet/pods/8852056c-2330-4f2b-acd3-89442f05e8c9/volumes" Dec 01 20:28:06 crc kubenswrapper[4802]: I1201 20:28:06.046304 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xpvd9"] Dec 01 20:28:06 crc kubenswrapper[4802]: I1201 20:28:06.055216 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xpvd9"] Dec 01 20:28:06 crc kubenswrapper[4802]: I1201 20:28:06.733261 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fea3a37-ff10-43ff-ace6-f79041e5617f" path="/var/lib/kubelet/pods/7fea3a37-ff10-43ff-ace6-f79041e5617f/volumes" Dec 01 20:28:08 crc kubenswrapper[4802]: I1201 20:28:08.274855 4802 generic.go:334] "Generic (PLEG): container finished" podID="4e509bad-279a-4da8-9f77-06123708e579" containerID="1fcb25fc29970b4c0d3998d519cf89e5f05646e4403fe83c821708fab03bfb6b" exitCode=0 Dec 01 20:28:08 crc kubenswrapper[4802]: I1201 20:28:08.274944 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" event={"ID":"4e509bad-279a-4da8-9f77-06123708e579","Type":"ContainerDied","Data":"1fcb25fc29970b4c0d3998d519cf89e5f05646e4403fe83c821708fab03bfb6b"} Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:09.799548 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:09.911797 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n25p\" (UniqueName: \"kubernetes.io/projected/4e509bad-279a-4da8-9f77-06123708e579-kube-api-access-4n25p\") pod \"4e509bad-279a-4da8-9f77-06123708e579\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:09.912236 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-ssh-key\") pod \"4e509bad-279a-4da8-9f77-06123708e579\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:09.912549 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-inventory\") pod \"4e509bad-279a-4da8-9f77-06123708e579\" (UID: \"4e509bad-279a-4da8-9f77-06123708e579\") " Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:09.922533 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e509bad-279a-4da8-9f77-06123708e579-kube-api-access-4n25p" (OuterVolumeSpecName: "kube-api-access-4n25p") pod "4e509bad-279a-4da8-9f77-06123708e579" (UID: "4e509bad-279a-4da8-9f77-06123708e579"). InnerVolumeSpecName "kube-api-access-4n25p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:09.945974 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e509bad-279a-4da8-9f77-06123708e579" (UID: "4e509bad-279a-4da8-9f77-06123708e579"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:09.965462 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-inventory" (OuterVolumeSpecName: "inventory") pod "4e509bad-279a-4da8-9f77-06123708e579" (UID: "4e509bad-279a-4da8-9f77-06123708e579"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.019638 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n25p\" (UniqueName: \"kubernetes.io/projected/4e509bad-279a-4da8-9f77-06123708e579-kube-api-access-4n25p\") on node \"crc\" DevicePath \"\"" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.019667 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.019677 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e509bad-279a-4da8-9f77-06123708e579-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.298166 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" event={"ID":"4e509bad-279a-4da8-9f77-06123708e579","Type":"ContainerDied","Data":"9c32d6faffb0de54956d98d8c601a74ff6c259a2ab3fca3a470bbfc23901df26"} Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.298246 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c32d6faffb0de54956d98d8c601a74ff6c259a2ab3fca3a470bbfc23901df26" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.298252 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.408697 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk"] Dec 01 20:28:10 crc kubenswrapper[4802]: E1201 20:28:10.409496 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e509bad-279a-4da8-9f77-06123708e579" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.409526 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e509bad-279a-4da8-9f77-06123708e579" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.409833 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e509bad-279a-4da8-9f77-06123708e579" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.412008 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.415881 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.417969 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.420387 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.420605 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.424298 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk"] Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.435139 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.435232 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.435941 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tv5g\" (UniqueName: \"kubernetes.io/projected/5d276417-7411-4240-8260-5a976730047e-kube-api-access-5tv5g\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.538989 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.539122 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.539329 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tv5g\" (UniqueName: \"kubernetes.io/projected/5d276417-7411-4240-8260-5a976730047e-kube-api-access-5tv5g\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.543439 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk\" (UID: 
\"5d276417-7411-4240-8260-5a976730047e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.547098 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.559073 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tv5g\" (UniqueName: \"kubernetes.io/projected/5d276417-7411-4240-8260-5a976730047e-kube-api-access-5tv5g\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:10 crc kubenswrapper[4802]: I1201 20:28:10.737590 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:11 crc kubenswrapper[4802]: I1201 20:28:11.398976 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk"] Dec 01 20:28:11 crc kubenswrapper[4802]: W1201 20:28:11.407557 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d276417_7411_4240_8260_5a976730047e.slice/crio-73681002f4e03f291a4179a2fbf22a7402fce5527d047a622af8404b475378ef WatchSource:0}: Error finding container 73681002f4e03f291a4179a2fbf22a7402fce5527d047a622af8404b475378ef: Status 404 returned error can't find the container with id 73681002f4e03f291a4179a2fbf22a7402fce5527d047a622af8404b475378ef Dec 01 20:28:12 crc kubenswrapper[4802]: I1201 20:28:12.317810 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" event={"ID":"5d276417-7411-4240-8260-5a976730047e","Type":"ContainerStarted","Data":"73681002f4e03f291a4179a2fbf22a7402fce5527d047a622af8404b475378ef"} Dec 01 20:28:13 crc kubenswrapper[4802]: I1201 20:28:13.029842 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xkpk5"] Dec 01 20:28:13 crc kubenswrapper[4802]: I1201 20:28:13.038910 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xkpk5"] Dec 01 20:28:13 crc kubenswrapper[4802]: I1201 20:28:13.333099 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" event={"ID":"5d276417-7411-4240-8260-5a976730047e","Type":"ContainerStarted","Data":"31d5ccfcebe6d7948a392f0c8d4d3481f47c32e1c8d108915b2c0c1c3856e5a4"} Dec 01 20:28:13 crc kubenswrapper[4802]: I1201 20:28:13.360673 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" podStartSLOduration=2.745806227 podStartE2EDuration="3.36062793s" podCreationTimestamp="2025-12-01 20:28:10 +0000 UTC" firstStartedPulling="2025-12-01 20:28:11.411028438 +0000 UTC m=+1912.973588119" lastFinishedPulling="2025-12-01 20:28:12.025850151 +0000 UTC m=+1913.588409822" observedRunningTime="2025-12-01 20:28:13.355441918 +0000 UTC m=+1914.918001639" watchObservedRunningTime="2025-12-01 20:28:13.36062793 +0000 UTC m=+1914.923187611" Dec 01 20:28:14 crc kubenswrapper[4802]: I1201 20:28:14.741693 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b1801b5-50f2-40fc-9e14-386216a4418c" path="/var/lib/kubelet/pods/8b1801b5-50f2-40fc-9e14-386216a4418c/volumes" Dec 01 20:28:15 crc kubenswrapper[4802]: I1201 20:28:15.031902 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-xcrnm"] Dec 01 20:28:15 crc kubenswrapper[4802]: I1201 20:28:15.039882 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-xcrnm"] Dec 01 20:28:16 crc kubenswrapper[4802]: I1201 20:28:16.734571 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0b1428-3468-4e47-939d-8614d302bd75" path="/var/lib/kubelet/pods/1f0b1428-3468-4e47-939d-8614d302bd75/volumes" Dec 01 20:28:17 crc kubenswrapper[4802]: I1201 20:28:17.376453 4802 generic.go:334] "Generic (PLEG): container finished" podID="5d276417-7411-4240-8260-5a976730047e" containerID="31d5ccfcebe6d7948a392f0c8d4d3481f47c32e1c8d108915b2c0c1c3856e5a4" exitCode=0 Dec 01 20:28:17 crc kubenswrapper[4802]: I1201 20:28:17.376655 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" event={"ID":"5d276417-7411-4240-8260-5a976730047e","Type":"ContainerDied","Data":"31d5ccfcebe6d7948a392f0c8d4d3481f47c32e1c8d108915b2c0c1c3856e5a4"} Dec 01 20:28:18 crc kubenswrapper[4802]: I1201 20:28:18.831045 4802 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:18 crc kubenswrapper[4802]: I1201 20:28:18.949716 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tv5g\" (UniqueName: \"kubernetes.io/projected/5d276417-7411-4240-8260-5a976730047e-kube-api-access-5tv5g\") pod \"5d276417-7411-4240-8260-5a976730047e\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " Dec 01 20:28:18 crc kubenswrapper[4802]: I1201 20:28:18.949789 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-inventory\") pod \"5d276417-7411-4240-8260-5a976730047e\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " Dec 01 20:28:18 crc kubenswrapper[4802]: I1201 20:28:18.949857 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-ssh-key\") pod \"5d276417-7411-4240-8260-5a976730047e\" (UID: \"5d276417-7411-4240-8260-5a976730047e\") " Dec 01 20:28:18 crc kubenswrapper[4802]: I1201 20:28:18.955910 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d276417-7411-4240-8260-5a976730047e-kube-api-access-5tv5g" (OuterVolumeSpecName: "kube-api-access-5tv5g") pod "5d276417-7411-4240-8260-5a976730047e" (UID: "5d276417-7411-4240-8260-5a976730047e"). InnerVolumeSpecName "kube-api-access-5tv5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:28:18 crc kubenswrapper[4802]: I1201 20:28:18.980992 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-inventory" (OuterVolumeSpecName: "inventory") pod "5d276417-7411-4240-8260-5a976730047e" (UID: "5d276417-7411-4240-8260-5a976730047e"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:28:18 crc kubenswrapper[4802]: I1201 20:28:18.983060 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5d276417-7411-4240-8260-5a976730047e" (UID: "5d276417-7411-4240-8260-5a976730047e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.053255 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tv5g\" (UniqueName: \"kubernetes.io/projected/5d276417-7411-4240-8260-5a976730047e-kube-api-access-5tv5g\") on node \"crc\" DevicePath \"\"" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.053515 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.053595 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5d276417-7411-4240-8260-5a976730047e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.397619 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" event={"ID":"5d276417-7411-4240-8260-5a976730047e","Type":"ContainerDied","Data":"73681002f4e03f291a4179a2fbf22a7402fce5527d047a622af8404b475378ef"} Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.397871 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73681002f4e03f291a4179a2fbf22a7402fce5527d047a622af8404b475378ef" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.397734 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.504860 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6"] Dec 01 20:28:19 crc kubenswrapper[4802]: E1201 20:28:19.505291 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d276417-7411-4240-8260-5a976730047e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.505308 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d276417-7411-4240-8260-5a976730047e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.505467 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d276417-7411-4240-8260-5a976730047e" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.506026 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.508627 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.510804 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.512387 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.512570 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.527767 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6"] Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.665095 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4str6\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.665174 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4str6\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.665267 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvfdq\" (UniqueName: \"kubernetes.io/projected/06c88831-4819-4a0b-901c-4b00adfda2a4-kube-api-access-rvfdq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4str6\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.766629 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4str6\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.766783 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4str6\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.766922 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvfdq\" (UniqueName: \"kubernetes.io/projected/06c88831-4819-4a0b-901c-4b00adfda2a4-kube-api-access-rvfdq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4str6\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.774122 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4str6\" (UID: 
\"06c88831-4819-4a0b-901c-4b00adfda2a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.775165 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4str6\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.788990 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvfdq\" (UniqueName: \"kubernetes.io/projected/06c88831-4819-4a0b-901c-4b00adfda2a4-kube-api-access-rvfdq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4str6\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:19 crc kubenswrapper[4802]: I1201 20:28:19.829361 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:28:20 crc kubenswrapper[4802]: I1201 20:28:20.031726 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-hkl2c"] Dec 01 20:28:20 crc kubenswrapper[4802]: I1201 20:28:20.039927 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-hkl2c"] Dec 01 20:28:20 crc kubenswrapper[4802]: I1201 20:28:20.465529 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6"] Dec 01 20:28:20 crc kubenswrapper[4802]: W1201 20:28:20.466142 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06c88831_4819_4a0b_901c_4b00adfda2a4.slice/crio-64c9a093f4bb9ab08409ad709e946bb1e7104438ef77236be20901e14f299c99 WatchSource:0}: Error finding container 64c9a093f4bb9ab08409ad709e946bb1e7104438ef77236be20901e14f299c99: Status 404 returned error can't find the container with id 64c9a093f4bb9ab08409ad709e946bb1e7104438ef77236be20901e14f299c99 Dec 01 20:28:20 crc kubenswrapper[4802]: I1201 20:28:20.740808 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d086846-e371-492b-81c2-0fb443b14f30" path="/var/lib/kubelet/pods/4d086846-e371-492b-81c2-0fb443b14f30/volumes" Dec 01 20:28:21 crc kubenswrapper[4802]: I1201 20:28:21.422673 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" event={"ID":"06c88831-4819-4a0b-901c-4b00adfda2a4","Type":"ContainerStarted","Data":"64c9a093f4bb9ab08409ad709e946bb1e7104438ef77236be20901e14f299c99"} Dec 01 20:28:22 crc kubenswrapper[4802]: I1201 20:28:22.459934 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" 
event={"ID":"06c88831-4819-4a0b-901c-4b00adfda2a4","Type":"ContainerStarted","Data":"b6f49da08bb316904fe562afe37558520b5fd287779804dca36b434fed9a5db5"} Dec 01 20:28:22 crc kubenswrapper[4802]: I1201 20:28:22.493986 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" podStartSLOduration=2.683860769 podStartE2EDuration="3.493953964s" podCreationTimestamp="2025-12-01 20:28:19 +0000 UTC" firstStartedPulling="2025-12-01 20:28:20.46844907 +0000 UTC m=+1922.031008711" lastFinishedPulling="2025-12-01 20:28:21.278542255 +0000 UTC m=+1922.841101906" observedRunningTime="2025-12-01 20:28:22.479471472 +0000 UTC m=+1924.042031113" watchObservedRunningTime="2025-12-01 20:28:22.493953964 +0000 UTC m=+1924.056513635" Dec 01 20:28:24 crc kubenswrapper[4802]: I1201 20:28:24.590161 4802 scope.go:117] "RemoveContainer" containerID="e96c6a0abb2e46394691b4dddbe5bf7e7d26106c1a695582de0e7c39ea9c73f2" Dec 01 20:28:24 crc kubenswrapper[4802]: I1201 20:28:24.627372 4802 scope.go:117] "RemoveContainer" containerID="b960e6fc96accaaf52672c959daf9ccdcd9204d9deb6a549e9947c5e12659930" Dec 01 20:28:24 crc kubenswrapper[4802]: I1201 20:28:24.673276 4802 scope.go:117] "RemoveContainer" containerID="4c2b475b9b0a9df89ec774654ac31deeee6d7364aaa82df14e1cfbcbc0456ff5" Dec 01 20:28:24 crc kubenswrapper[4802]: I1201 20:28:24.733775 4802 scope.go:117] "RemoveContainer" containerID="cc9037216683e5e4d67c82eb3b1440096e90685b5a00073794fd433b4b23a47c" Dec 01 20:28:24 crc kubenswrapper[4802]: I1201 20:28:24.795622 4802 scope.go:117] "RemoveContainer" containerID="6d42a16ae8e21143fd39acba472b9547ba784f977d607107345fafa541dd575a" Dec 01 20:28:24 crc kubenswrapper[4802]: I1201 20:28:24.867531 4802 scope.go:117] "RemoveContainer" containerID="a4ad1aef6c49ef0c311a9993a4f30b0878106ef7dd57c60eb5a085e2caf3edba" Dec 01 20:28:24 crc kubenswrapper[4802]: I1201 20:28:24.917606 4802 scope.go:117] "RemoveContainer" 
containerID="c38b6d4c5b810d41a5b58aa0bbb34e2c7cb04c58073e46a32b8328e270092bd6" Dec 01 20:28:28 crc kubenswrapper[4802]: I1201 20:28:28.041250 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-p259k"] Dec 01 20:28:28 crc kubenswrapper[4802]: I1201 20:28:28.051982 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-p259k"] Dec 01 20:28:28 crc kubenswrapper[4802]: I1201 20:28:28.738337 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f6c930-0ed7-480e-8725-692427ba2b9d" path="/var/lib/kubelet/pods/93f6c930-0ed7-480e-8725-692427ba2b9d/volumes" Dec 01 20:28:59 crc kubenswrapper[4802]: I1201 20:28:59.066665 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ee5c-account-create-update-xw4zq"] Dec 01 20:28:59 crc kubenswrapper[4802]: I1201 20:28:59.084273 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-qplvx"] Dec 01 20:28:59 crc kubenswrapper[4802]: I1201 20:28:59.102858 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-qplvx"] Dec 01 20:28:59 crc kubenswrapper[4802]: I1201 20:28:59.113792 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ee5c-account-create-update-xw4zq"] Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.040541 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-aa02-account-create-update-tf6q4"] Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.055958 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3c20-account-create-update-frnjt"] Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.065110 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-2kqwl"] Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.073600 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-3c20-account-create-update-frnjt"] Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.079644 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-xf6km"] Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.085535 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-aa02-account-create-update-tf6q4"] Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.092650 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-2kqwl"] Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.103219 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-xf6km"] Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.741402 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cf7526-8d6a-4789-8582-854087ec7b2b" path="/var/lib/kubelet/pods/53cf7526-8d6a-4789-8582-854087ec7b2b/volumes" Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.742820 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13d5208-da5b-4c02-84e5-871547bbafbf" path="/var/lib/kubelet/pods/a13d5208-da5b-4c02-84e5-871547bbafbf/volumes" Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.744254 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26bcb68-81a2-43fa-b81f-29dbc8ca213e" path="/var/lib/kubelet/pods/b26bcb68-81a2-43fa-b81f-29dbc8ca213e/volumes" Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.746082 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e" path="/var/lib/kubelet/pods/c272ab07-d57f-48d2-a1a6-e3e9ed4cab0e/volumes" Dec 01 20:29:00 crc kubenswrapper[4802]: I1201 20:29:00.748370 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc9020e-4f69-49c4-9788-aa84f98d8c77" path="/var/lib/kubelet/pods/efc9020e-4f69-49c4-9788-aa84f98d8c77/volumes" Dec 01 20:29:00 crc 
kubenswrapper[4802]: I1201 20:29:00.749569 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3b62ea-1f25-4221-b09f-5a915164fa80" path="/var/lib/kubelet/pods/fc3b62ea-1f25-4221-b09f-5a915164fa80/volumes" Dec 01 20:29:22 crc kubenswrapper[4802]: I1201 20:29:22.167678 4802 generic.go:334] "Generic (PLEG): container finished" podID="06c88831-4819-4a0b-901c-4b00adfda2a4" containerID="b6f49da08bb316904fe562afe37558520b5fd287779804dca36b434fed9a5db5" exitCode=0 Dec 01 20:29:22 crc kubenswrapper[4802]: I1201 20:29:22.167762 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" event={"ID":"06c88831-4819-4a0b-901c-4b00adfda2a4","Type":"ContainerDied","Data":"b6f49da08bb316904fe562afe37558520b5fd287779804dca36b434fed9a5db5"} Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.716847 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.890363 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-ssh-key\") pod \"06c88831-4819-4a0b-901c-4b00adfda2a4\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.891399 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-inventory\") pod \"06c88831-4819-4a0b-901c-4b00adfda2a4\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.891646 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvfdq\" (UniqueName: \"kubernetes.io/projected/06c88831-4819-4a0b-901c-4b00adfda2a4-kube-api-access-rvfdq\") pod 
\"06c88831-4819-4a0b-901c-4b00adfda2a4\" (UID: \"06c88831-4819-4a0b-901c-4b00adfda2a4\") " Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.898834 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c88831-4819-4a0b-901c-4b00adfda2a4-kube-api-access-rvfdq" (OuterVolumeSpecName: "kube-api-access-rvfdq") pod "06c88831-4819-4a0b-901c-4b00adfda2a4" (UID: "06c88831-4819-4a0b-901c-4b00adfda2a4"). InnerVolumeSpecName "kube-api-access-rvfdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.927948 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "06c88831-4819-4a0b-901c-4b00adfda2a4" (UID: "06c88831-4819-4a0b-901c-4b00adfda2a4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.946559 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-inventory" (OuterVolumeSpecName: "inventory") pod "06c88831-4819-4a0b-901c-4b00adfda2a4" (UID: "06c88831-4819-4a0b-901c-4b00adfda2a4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.996308 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvfdq\" (UniqueName: \"kubernetes.io/projected/06c88831-4819-4a0b-901c-4b00adfda2a4-kube-api-access-rvfdq\") on node \"crc\" DevicePath \"\"" Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.996339 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:29:23 crc kubenswrapper[4802]: I1201 20:29:23.996350 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06c88831-4819-4a0b-901c-4b00adfda2a4-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.195906 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" event={"ID":"06c88831-4819-4a0b-901c-4b00adfda2a4","Type":"ContainerDied","Data":"64c9a093f4bb9ab08409ad709e946bb1e7104438ef77236be20901e14f299c99"} Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.196317 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64c9a093f4bb9ab08409ad709e946bb1e7104438ef77236be20901e14f299c99" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.196009 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.308796 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8wwpr"] Dec 01 20:29:24 crc kubenswrapper[4802]: E1201 20:29:24.309529 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c88831-4819-4a0b-901c-4b00adfda2a4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.309559 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c88831-4819-4a0b-901c-4b00adfda2a4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.309896 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="06c88831-4819-4a0b-901c-4b00adfda2a4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.310988 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.313996 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.314178 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.314888 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.315016 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.319631 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8wwpr"] Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.405112 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8wwpr\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.405172 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8wwpr\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.405225 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7254m\" (UniqueName: \"kubernetes.io/projected/2c587bf8-2825-488a-8a22-d224dfdd6073-kube-api-access-7254m\") pod \"ssh-known-hosts-edpm-deployment-8wwpr\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.506644 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8wwpr\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.506689 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7254m\" (UniqueName: \"kubernetes.io/projected/2c587bf8-2825-488a-8a22-d224dfdd6073-kube-api-access-7254m\") pod \"ssh-known-hosts-edpm-deployment-8wwpr\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.506843 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8wwpr\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.512954 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8wwpr\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 
20:29:24.513066 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8wwpr\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.532265 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7254m\" (UniqueName: \"kubernetes.io/projected/2c587bf8-2825-488a-8a22-d224dfdd6073-kube-api-access-7254m\") pod \"ssh-known-hosts-edpm-deployment-8wwpr\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:24 crc kubenswrapper[4802]: I1201 20:29:24.638907 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:25 crc kubenswrapper[4802]: I1201 20:29:25.067245 4802 scope.go:117] "RemoveContainer" containerID="6bea7ec021452b58fc7fec289da0c417cb45f371401c8b3cf7ed26ab7e516354" Dec 01 20:29:25 crc kubenswrapper[4802]: I1201 20:29:25.104502 4802 scope.go:117] "RemoveContainer" containerID="9a72b4640dc57e0f397acdcae94effa644ef7c6f13439da636080dfd29401d8b" Dec 01 20:29:25 crc kubenswrapper[4802]: I1201 20:29:25.166784 4802 scope.go:117] "RemoveContainer" containerID="989dd14f44779cadab7b021d956f743850af71d135a9136601e559fbba569b63" Dec 01 20:29:25 crc kubenswrapper[4802]: I1201 20:29:25.224059 4802 scope.go:117] "RemoveContainer" containerID="746c1a240b5c5c18cc9f61fb2c37fbff5380dc9cad4c85ea9cc09ab494a06a5d" Dec 01 20:29:25 crc kubenswrapper[4802]: W1201 20:29:25.230489 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c587bf8_2825_488a_8a22_d224dfdd6073.slice/crio-87a5193de6c7ddf171985b623249913d49925eee6d593d7eaed0064fe1b4f33c WatchSource:0}: Error 
finding container 87a5193de6c7ddf171985b623249913d49925eee6d593d7eaed0064fe1b4f33c: Status 404 returned error can't find the container with id 87a5193de6c7ddf171985b623249913d49925eee6d593d7eaed0064fe1b4f33c Dec 01 20:29:25 crc kubenswrapper[4802]: I1201 20:29:25.233813 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8wwpr"] Dec 01 20:29:25 crc kubenswrapper[4802]: I1201 20:29:25.276155 4802 scope.go:117] "RemoveContainer" containerID="4f2dbefc833f7007d51200a196e4c47c2805ddacb4d255b9f0013ee4abd07172" Dec 01 20:29:25 crc kubenswrapper[4802]: I1201 20:29:25.303810 4802 scope.go:117] "RemoveContainer" containerID="9e7d7b9c7371b813189ed90f7f50dd2604410c6435ea5d2c30fc4a0a5501a80d" Dec 01 20:29:25 crc kubenswrapper[4802]: I1201 20:29:25.332066 4802 scope.go:117] "RemoveContainer" containerID="40026d4792378d5240c3f320f5f8d901bda39fbae51edd49b253b8675229f89c" Dec 01 20:29:26 crc kubenswrapper[4802]: I1201 20:29:26.043705 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r7wmv"] Dec 01 20:29:26 crc kubenswrapper[4802]: I1201 20:29:26.055397 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-r7wmv"] Dec 01 20:29:26 crc kubenswrapper[4802]: I1201 20:29:26.229673 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" event={"ID":"2c587bf8-2825-488a-8a22-d224dfdd6073","Type":"ContainerStarted","Data":"ad4d84e14b1fc7d09b7a8f0a0a47e21569efc3fd5b67c97d7df28751efef04b1"} Dec 01 20:29:26 crc kubenswrapper[4802]: I1201 20:29:26.229727 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" event={"ID":"2c587bf8-2825-488a-8a22-d224dfdd6073","Type":"ContainerStarted","Data":"87a5193de6c7ddf171985b623249913d49925eee6d593d7eaed0064fe1b4f33c"} Dec 01 20:29:26 crc kubenswrapper[4802]: I1201 20:29:26.252108 4802 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" podStartSLOduration=1.677448682 podStartE2EDuration="2.250075286s" podCreationTimestamp="2025-12-01 20:29:24 +0000 UTC" firstStartedPulling="2025-12-01 20:29:25.259693908 +0000 UTC m=+1986.822253589" lastFinishedPulling="2025-12-01 20:29:25.832320532 +0000 UTC m=+1987.394880193" observedRunningTime="2025-12-01 20:29:26.249512049 +0000 UTC m=+1987.812071730" watchObservedRunningTime="2025-12-01 20:29:26.250075286 +0000 UTC m=+1987.812634937" Dec 01 20:29:26 crc kubenswrapper[4802]: I1201 20:29:26.733572 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc32fb72-22ab-49e4-89b3-bb5aaf4c3456" path="/var/lib/kubelet/pods/fc32fb72-22ab-49e4-89b3-bb5aaf4c3456/volumes" Dec 01 20:29:28 crc kubenswrapper[4802]: I1201 20:29:28.088937 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:29:28 crc kubenswrapper[4802]: I1201 20:29:28.089746 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:29:34 crc kubenswrapper[4802]: I1201 20:29:34.342157 4802 generic.go:334] "Generic (PLEG): container finished" podID="2c587bf8-2825-488a-8a22-d224dfdd6073" containerID="ad4d84e14b1fc7d09b7a8f0a0a47e21569efc3fd5b67c97d7df28751efef04b1" exitCode=0 Dec 01 20:29:34 crc kubenswrapper[4802]: I1201 20:29:34.342243 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" 
event={"ID":"2c587bf8-2825-488a-8a22-d224dfdd6073","Type":"ContainerDied","Data":"ad4d84e14b1fc7d09b7a8f0a0a47e21569efc3fd5b67c97d7df28751efef04b1"} Dec 01 20:29:35 crc kubenswrapper[4802]: I1201 20:29:35.828703 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.015715 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-inventory-0\") pod \"2c587bf8-2825-488a-8a22-d224dfdd6073\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.015813 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-ssh-key-openstack-edpm-ipam\") pod \"2c587bf8-2825-488a-8a22-d224dfdd6073\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.015855 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7254m\" (UniqueName: \"kubernetes.io/projected/2c587bf8-2825-488a-8a22-d224dfdd6073-kube-api-access-7254m\") pod \"2c587bf8-2825-488a-8a22-d224dfdd6073\" (UID: \"2c587bf8-2825-488a-8a22-d224dfdd6073\") " Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.029469 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c587bf8-2825-488a-8a22-d224dfdd6073-kube-api-access-7254m" (OuterVolumeSpecName: "kube-api-access-7254m") pod "2c587bf8-2825-488a-8a22-d224dfdd6073" (UID: "2c587bf8-2825-488a-8a22-d224dfdd6073"). InnerVolumeSpecName "kube-api-access-7254m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.064858 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "2c587bf8-2825-488a-8a22-d224dfdd6073" (UID: "2c587bf8-2825-488a-8a22-d224dfdd6073"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.071943 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c587bf8-2825-488a-8a22-d224dfdd6073" (UID: "2c587bf8-2825-488a-8a22-d224dfdd6073"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.117853 4802 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.117907 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c587bf8-2825-488a-8a22-d224dfdd6073-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.117928 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7254m\" (UniqueName: \"kubernetes.io/projected/2c587bf8-2825-488a-8a22-d224dfdd6073-kube-api-access-7254m\") on node \"crc\" DevicePath \"\"" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.368332 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" 
event={"ID":"2c587bf8-2825-488a-8a22-d224dfdd6073","Type":"ContainerDied","Data":"87a5193de6c7ddf171985b623249913d49925eee6d593d7eaed0064fe1b4f33c"} Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.368399 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87a5193de6c7ddf171985b623249913d49925eee6d593d7eaed0064fe1b4f33c" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.368441 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8wwpr" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.502496 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m"] Dec 01 20:29:36 crc kubenswrapper[4802]: E1201 20:29:36.503403 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c587bf8-2825-488a-8a22-d224dfdd6073" containerName="ssh-known-hosts-edpm-deployment" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.503527 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c587bf8-2825-488a-8a22-d224dfdd6073" containerName="ssh-known-hosts-edpm-deployment" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.503969 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c587bf8-2825-488a-8a22-d224dfdd6073" containerName="ssh-known-hosts-edpm-deployment" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.507069 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.509916 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.510022 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.511134 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m"] Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.511248 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.511409 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.554763 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n9nx\" (UniqueName: \"kubernetes.io/projected/49bf0a46-b184-40ac-8f37-4c8c38d1e455-kube-api-access-8n9nx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wg72m\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.555139 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wg72m\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.555333 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wg72m\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.657992 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wg72m\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.658303 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n9nx\" (UniqueName: \"kubernetes.io/projected/49bf0a46-b184-40ac-8f37-4c8c38d1e455-kube-api-access-8n9nx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wg72m\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.658562 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wg72m\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.663384 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wg72m\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.663449 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wg72m\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.687401 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n9nx\" (UniqueName: \"kubernetes.io/projected/49bf0a46-b184-40ac-8f37-4c8c38d1e455-kube-api-access-8n9nx\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wg72m\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:36 crc kubenswrapper[4802]: I1201 20:29:36.837847 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:37 crc kubenswrapper[4802]: I1201 20:29:37.397797 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m"] Dec 01 20:29:38 crc kubenswrapper[4802]: I1201 20:29:38.394072 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" event={"ID":"49bf0a46-b184-40ac-8f37-4c8c38d1e455","Type":"ContainerStarted","Data":"0f4ff116b520e878f5afcd628b57664fc0a65580894d364a5d724670691557a9"} Dec 01 20:29:38 crc kubenswrapper[4802]: I1201 20:29:38.394568 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" event={"ID":"49bf0a46-b184-40ac-8f37-4c8c38d1e455","Type":"ContainerStarted","Data":"8033f133b0722c723a83a7c64f53d6c7af94639010a9a1dc2c884edd51ded0c4"} Dec 01 20:29:38 crc kubenswrapper[4802]: I1201 20:29:38.420763 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" podStartSLOduration=1.735110718 podStartE2EDuration="2.420742973s" podCreationTimestamp="2025-12-01 20:29:36 +0000 UTC" firstStartedPulling="2025-12-01 20:29:37.414258242 +0000 UTC m=+1998.976817883" lastFinishedPulling="2025-12-01 20:29:38.099890467 +0000 UTC m=+1999.662450138" observedRunningTime="2025-12-01 20:29:38.412827665 +0000 UTC m=+1999.975387326" watchObservedRunningTime="2025-12-01 20:29:38.420742973 +0000 UTC m=+1999.983302614" Dec 01 20:29:47 crc kubenswrapper[4802]: I1201 20:29:47.489917 4802 generic.go:334] "Generic (PLEG): container finished" podID="49bf0a46-b184-40ac-8f37-4c8c38d1e455" containerID="0f4ff116b520e878f5afcd628b57664fc0a65580894d364a5d724670691557a9" exitCode=0 Dec 01 20:29:47 crc kubenswrapper[4802]: I1201 20:29:47.489987 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" event={"ID":"49bf0a46-b184-40ac-8f37-4c8c38d1e455","Type":"ContainerDied","Data":"0f4ff116b520e878f5afcd628b57664fc0a65580894d364a5d724670691557a9"} Dec 01 20:29:48 crc kubenswrapper[4802]: I1201 20:29:48.056561 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pwkmp"] Dec 01 20:29:48 crc kubenswrapper[4802]: I1201 20:29:48.063949 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pwkmp"] Dec 01 20:29:48 crc kubenswrapper[4802]: I1201 20:29:48.737485 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58f6525-0009-410b-ba8d-56e7256e9d32" path="/var/lib/kubelet/pods/c58f6525-0009-410b-ba8d-56e7256e9d32/volumes" Dec 01 20:29:48 crc kubenswrapper[4802]: I1201 20:29:48.956009 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.029371 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-x9d74"] Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.039594 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-x9d74"] Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.100344 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-ssh-key\") pod \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.100663 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-inventory\") pod \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\" (UID: 
\"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.100748 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n9nx\" (UniqueName: \"kubernetes.io/projected/49bf0a46-b184-40ac-8f37-4c8c38d1e455-kube-api-access-8n9nx\") pod \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\" (UID: \"49bf0a46-b184-40ac-8f37-4c8c38d1e455\") " Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.106238 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49bf0a46-b184-40ac-8f37-4c8c38d1e455-kube-api-access-8n9nx" (OuterVolumeSpecName: "kube-api-access-8n9nx") pod "49bf0a46-b184-40ac-8f37-4c8c38d1e455" (UID: "49bf0a46-b184-40ac-8f37-4c8c38d1e455"). InnerVolumeSpecName "kube-api-access-8n9nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.125946 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-inventory" (OuterVolumeSpecName: "inventory") pod "49bf0a46-b184-40ac-8f37-4c8c38d1e455" (UID: "49bf0a46-b184-40ac-8f37-4c8c38d1e455"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.128160 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "49bf0a46-b184-40ac-8f37-4c8c38d1e455" (UID: "49bf0a46-b184-40ac-8f37-4c8c38d1e455"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.203425 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.203451 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49bf0a46-b184-40ac-8f37-4c8c38d1e455-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.203461 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n9nx\" (UniqueName: \"kubernetes.io/projected/49bf0a46-b184-40ac-8f37-4c8c38d1e455-kube-api-access-8n9nx\") on node \"crc\" DevicePath \"\"" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.512904 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" event={"ID":"49bf0a46-b184-40ac-8f37-4c8c38d1e455","Type":"ContainerDied","Data":"8033f133b0722c723a83a7c64f53d6c7af94639010a9a1dc2c884edd51ded0c4"} Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.512956 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8033f133b0722c723a83a7c64f53d6c7af94639010a9a1dc2c884edd51ded0c4" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.513026 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.599768 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q"] Dec 01 20:29:49 crc kubenswrapper[4802]: E1201 20:29:49.600883 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49bf0a46-b184-40ac-8f37-4c8c38d1e455" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.600919 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="49bf0a46-b184-40ac-8f37-4c8c38d1e455" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.601275 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="49bf0a46-b184-40ac-8f37-4c8c38d1e455" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.602200 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.606185 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.606408 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.607505 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.607891 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.614781 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q"] Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.711366 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65sd2\" (UniqueName: \"kubernetes.io/projected/567bc985-ed80-43a1-8063-471d500ed6b6-kube-api-access-65sd2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.711412 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.714051 4802 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.815340 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65sd2\" (UniqueName: \"kubernetes.io/projected/567bc985-ed80-43a1-8063-471d500ed6b6-kube-api-access-65sd2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.815382 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.815460 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.819774 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q\" (UID: 
\"567bc985-ed80-43a1-8063-471d500ed6b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.822536 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.838706 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65sd2\" (UniqueName: \"kubernetes.io/projected/567bc985-ed80-43a1-8063-471d500ed6b6-kube-api-access-65sd2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:49 crc kubenswrapper[4802]: I1201 20:29:49.927701 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:29:50 crc kubenswrapper[4802]: I1201 20:29:50.458757 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q"] Dec 01 20:29:50 crc kubenswrapper[4802]: I1201 20:29:50.524630 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" event={"ID":"567bc985-ed80-43a1-8063-471d500ed6b6","Type":"ContainerStarted","Data":"dadfd2c1b412d8f5e7d36e7cac0a727e3e432e75e6a8461e406e6238400c5c54"} Dec 01 20:29:50 crc kubenswrapper[4802]: I1201 20:29:50.730747 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffbb23b4-8984-4d8f-8301-81c25799727d" path="/var/lib/kubelet/pods/ffbb23b4-8984-4d8f-8301-81c25799727d/volumes" Dec 01 20:29:51 crc kubenswrapper[4802]: I1201 20:29:51.535811 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" event={"ID":"567bc985-ed80-43a1-8063-471d500ed6b6","Type":"ContainerStarted","Data":"698d459d0ded6c9b623351b9d47258461f3bf1cfbffa744ef5a1b015e67a7c56"} Dec 01 20:29:51 crc kubenswrapper[4802]: I1201 20:29:51.560312 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" podStartSLOduration=1.764932191 podStartE2EDuration="2.560277195s" podCreationTimestamp="2025-12-01 20:29:49 +0000 UTC" firstStartedPulling="2025-12-01 20:29:50.469736918 +0000 UTC m=+2012.032296599" lastFinishedPulling="2025-12-01 20:29:51.265081962 +0000 UTC m=+2012.827641603" observedRunningTime="2025-12-01 20:29:51.552926196 +0000 UTC m=+2013.115485877" watchObservedRunningTime="2025-12-01 20:29:51.560277195 +0000 UTC m=+2013.122836906" Dec 01 20:29:58 crc kubenswrapper[4802]: I1201 20:29:58.088716 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:29:58 crc kubenswrapper[4802]: I1201 20:29:58.089397 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.137102 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s"] Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.138513 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.144712 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.146921 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.158387 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s"] Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.327954 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d4be394-231f-4c82-81ec-6fe168ffc485-secret-volume\") pod \"collect-profiles-29410350-wwt7s\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.328027 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d4be394-231f-4c82-81ec-6fe168ffc485-config-volume\") pod \"collect-profiles-29410350-wwt7s\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.328774 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rzgv\" (UniqueName: \"kubernetes.io/projected/7d4be394-231f-4c82-81ec-6fe168ffc485-kube-api-access-8rzgv\") pod \"collect-profiles-29410350-wwt7s\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.431565 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d4be394-231f-4c82-81ec-6fe168ffc485-secret-volume\") pod \"collect-profiles-29410350-wwt7s\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.431640 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d4be394-231f-4c82-81ec-6fe168ffc485-config-volume\") pod \"collect-profiles-29410350-wwt7s\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.431725 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rzgv\" (UniqueName: 
\"kubernetes.io/projected/7d4be394-231f-4c82-81ec-6fe168ffc485-kube-api-access-8rzgv\") pod \"collect-profiles-29410350-wwt7s\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.432710 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d4be394-231f-4c82-81ec-6fe168ffc485-config-volume\") pod \"collect-profiles-29410350-wwt7s\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.444370 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d4be394-231f-4c82-81ec-6fe168ffc485-secret-volume\") pod \"collect-profiles-29410350-wwt7s\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.451833 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rzgv\" (UniqueName: \"kubernetes.io/projected/7d4be394-231f-4c82-81ec-6fe168ffc485-kube-api-access-8rzgv\") pod \"collect-profiles-29410350-wwt7s\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.469946 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:00 crc kubenswrapper[4802]: I1201 20:30:00.951350 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s"] Dec 01 20:30:01 crc kubenswrapper[4802]: I1201 20:30:01.648173 4802 generic.go:334] "Generic (PLEG): container finished" podID="7d4be394-231f-4c82-81ec-6fe168ffc485" containerID="d719b68d93a4019f6fef774a39b751ba8a09b5124f937e8658f57983969a6bf3" exitCode=0 Dec 01 20:30:01 crc kubenswrapper[4802]: I1201 20:30:01.648239 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" event={"ID":"7d4be394-231f-4c82-81ec-6fe168ffc485","Type":"ContainerDied","Data":"d719b68d93a4019f6fef774a39b751ba8a09b5124f937e8658f57983969a6bf3"} Dec 01 20:30:01 crc kubenswrapper[4802]: I1201 20:30:01.648548 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" event={"ID":"7d4be394-231f-4c82-81ec-6fe168ffc485","Type":"ContainerStarted","Data":"bb1ca475582c0fb16855271542b2f4b4632f252eacbb14e6a9365cb467a3110d"} Dec 01 20:30:02 crc kubenswrapper[4802]: I1201 20:30:02.658754 4802 generic.go:334] "Generic (PLEG): container finished" podID="567bc985-ed80-43a1-8063-471d500ed6b6" containerID="698d459d0ded6c9b623351b9d47258461f3bf1cfbffa744ef5a1b015e67a7c56" exitCode=0 Dec 01 20:30:02 crc kubenswrapper[4802]: I1201 20:30:02.658933 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" event={"ID":"567bc985-ed80-43a1-8063-471d500ed6b6","Type":"ContainerDied","Data":"698d459d0ded6c9b623351b9d47258461f3bf1cfbffa744ef5a1b015e67a7c56"} Dec 01 20:30:02 crc kubenswrapper[4802]: I1201 20:30:02.992652 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.184468 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d4be394-231f-4c82-81ec-6fe168ffc485-config-volume\") pod \"7d4be394-231f-4c82-81ec-6fe168ffc485\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.184609 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d4be394-231f-4c82-81ec-6fe168ffc485-secret-volume\") pod \"7d4be394-231f-4c82-81ec-6fe168ffc485\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.184654 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rzgv\" (UniqueName: \"kubernetes.io/projected/7d4be394-231f-4c82-81ec-6fe168ffc485-kube-api-access-8rzgv\") pod \"7d4be394-231f-4c82-81ec-6fe168ffc485\" (UID: \"7d4be394-231f-4c82-81ec-6fe168ffc485\") " Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.185804 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d4be394-231f-4c82-81ec-6fe168ffc485-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d4be394-231f-4c82-81ec-6fe168ffc485" (UID: "7d4be394-231f-4c82-81ec-6fe168ffc485"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.190267 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4be394-231f-4c82-81ec-6fe168ffc485-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d4be394-231f-4c82-81ec-6fe168ffc485" (UID: "7d4be394-231f-4c82-81ec-6fe168ffc485"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.190475 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4be394-231f-4c82-81ec-6fe168ffc485-kube-api-access-8rzgv" (OuterVolumeSpecName: "kube-api-access-8rzgv") pod "7d4be394-231f-4c82-81ec-6fe168ffc485" (UID: "7d4be394-231f-4c82-81ec-6fe168ffc485"). InnerVolumeSpecName "kube-api-access-8rzgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.287139 4802 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d4be394-231f-4c82-81ec-6fe168ffc485-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.287397 4802 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d4be394-231f-4c82-81ec-6fe168ffc485-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.287457 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rzgv\" (UniqueName: \"kubernetes.io/projected/7d4be394-231f-4c82-81ec-6fe168ffc485-kube-api-access-8rzgv\") on node \"crc\" DevicePath \"\"" Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.669369 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" event={"ID":"7d4be394-231f-4c82-81ec-6fe168ffc485","Type":"ContainerDied","Data":"bb1ca475582c0fb16855271542b2f4b4632f252eacbb14e6a9365cb467a3110d"} Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.669439 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb1ca475582c0fb16855271542b2f4b4632f252eacbb14e6a9365cb467a3110d" Dec 01 20:30:03 crc kubenswrapper[4802]: I1201 20:30:03.669388 4802 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410350-wwt7s" Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.070450 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv"] Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.079053 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410305-db4dv"] Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.109546 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.202534 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65sd2\" (UniqueName: \"kubernetes.io/projected/567bc985-ed80-43a1-8063-471d500ed6b6-kube-api-access-65sd2\") pod \"567bc985-ed80-43a1-8063-471d500ed6b6\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.202570 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-ssh-key\") pod \"567bc985-ed80-43a1-8063-471d500ed6b6\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.202862 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-inventory\") pod \"567bc985-ed80-43a1-8063-471d500ed6b6\" (UID: \"567bc985-ed80-43a1-8063-471d500ed6b6\") " Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.207679 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567bc985-ed80-43a1-8063-471d500ed6b6-kube-api-access-65sd2" 
(OuterVolumeSpecName: "kube-api-access-65sd2") pod "567bc985-ed80-43a1-8063-471d500ed6b6" (UID: "567bc985-ed80-43a1-8063-471d500ed6b6"). InnerVolumeSpecName "kube-api-access-65sd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.231065 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-inventory" (OuterVolumeSpecName: "inventory") pod "567bc985-ed80-43a1-8063-471d500ed6b6" (UID: "567bc985-ed80-43a1-8063-471d500ed6b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.233071 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "567bc985-ed80-43a1-8063-471d500ed6b6" (UID: "567bc985-ed80-43a1-8063-471d500ed6b6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.304388 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.304421 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65sd2\" (UniqueName: \"kubernetes.io/projected/567bc985-ed80-43a1-8063-471d500ed6b6-kube-api-access-65sd2\") on node \"crc\" DevicePath \"\""
Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.304431 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/567bc985-ed80-43a1-8063-471d500ed6b6-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.680275 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q" event={"ID":"567bc985-ed80-43a1-8063-471d500ed6b6","Type":"ContainerDied","Data":"dadfd2c1b412d8f5e7d36e7cac0a727e3e432e75e6a8461e406e6238400c5c54"}
Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.680309 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dadfd2c1b412d8f5e7d36e7cac0a727e3e432e75e6a8461e406e6238400c5c54"
Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.680356 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q"
Dec 01 20:30:04 crc kubenswrapper[4802]: I1201 20:30:04.734776 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="081dccc6-dbee-40a9-8333-d1c178e3fab3" path="/var/lib/kubelet/pods/081dccc6-dbee-40a9-8333-d1c178e3fab3/volumes"
Dec 01 20:30:25 crc kubenswrapper[4802]: I1201 20:30:25.487736 4802 scope.go:117] "RemoveContainer" containerID="14ba7088961527c5c9887e5909d2898863b4b063e7bb01065f35614ce104b714"
Dec 01 20:30:25 crc kubenswrapper[4802]: I1201 20:30:25.511817 4802 scope.go:117] "RemoveContainer" containerID="280ebc12bc9cce6b2e9e70d1c3369d3c6a845c79e2cdc14e43e4779917952ab0"
Dec 01 20:30:25 crc kubenswrapper[4802]: I1201 20:30:25.599529 4802 scope.go:117] "RemoveContainer" containerID="bc1456a0928d6b42f7cc78973ebf42570d5bd634528b094e894c37af799ab3a8"
Dec 01 20:30:25 crc kubenswrapper[4802]: I1201 20:30:25.661601 4802 scope.go:117] "RemoveContainer" containerID="63a09e19a16e3d246e87761831ab8a56de058b847c8ffab420ed16ab34b2dc0a"
Dec 01 20:30:28 crc kubenswrapper[4802]: I1201 20:30:28.088584 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 20:30:28 crc kubenswrapper[4802]: I1201 20:30:28.089131 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 20:30:28 crc kubenswrapper[4802]: I1201 20:30:28.089166 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd"
Dec 01 20:30:28 crc kubenswrapper[4802]: I1201 20:30:28.089859 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26cf5483ade0dbc95267728676ce0ccc960449233741d088558ac81f81de5fa8"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 20:30:28 crc kubenswrapper[4802]: I1201 20:30:28.089918 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://26cf5483ade0dbc95267728676ce0ccc960449233741d088558ac81f81de5fa8" gracePeriod=600
Dec 01 20:30:28 crc kubenswrapper[4802]: I1201 20:30:28.938846 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"26cf5483ade0dbc95267728676ce0ccc960449233741d088558ac81f81de5fa8"}
Dec 01 20:30:28 crc kubenswrapper[4802]: I1201 20:30:28.939423 4802 scope.go:117] "RemoveContainer" containerID="ea229b2e7cc2b396a7f9bb760287c217dca1e15b2b033f8a84bbb0e3d5464619"
Dec 01 20:30:28 crc kubenswrapper[4802]: I1201 20:30:28.938885 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="26cf5483ade0dbc95267728676ce0ccc960449233741d088558ac81f81de5fa8" exitCode=0
Dec 01 20:30:28 crc kubenswrapper[4802]: I1201 20:30:28.939533 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd"
event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88"}
Dec 01 20:30:34 crc kubenswrapper[4802]: I1201 20:30:34.037987 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wb4qn"]
Dec 01 20:30:34 crc kubenswrapper[4802]: I1201 20:30:34.045182 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wb4qn"]
Dec 01 20:30:34 crc kubenswrapper[4802]: I1201 20:30:34.730608 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a77beefc-0880-4f76-b2d4-4b3b1b3c42e9" path="/var/lib/kubelet/pods/a77beefc-0880-4f76-b2d4-4b3b1b3c42e9/volumes"
Dec 01 20:31:25 crc kubenswrapper[4802]: I1201 20:31:25.814042 4802 scope.go:117] "RemoveContainer" containerID="262fe17bf5ba9453ade40441101279313abb7f5a6d1f8c92df72c8ab42e97d8c"
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.821093 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w968c"]
Dec 01 20:31:33 crc kubenswrapper[4802]: E1201 20:31:33.822106 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4be394-231f-4c82-81ec-6fe168ffc485" containerName="collect-profiles"
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.822123 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4be394-231f-4c82-81ec-6fe168ffc485" containerName="collect-profiles"
Dec 01 20:31:33 crc kubenswrapper[4802]: E1201 20:31:33.822134 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567bc985-ed80-43a1-8063-471d500ed6b6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.822142 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="567bc985-ed80-43a1-8063-471d500ed6b6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.822357 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="567bc985-ed80-43a1-8063-471d500ed6b6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.822383 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4be394-231f-4c82-81ec-6fe168ffc485" containerName="collect-profiles"
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.823620 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.846876 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w968c"]
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.940908 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nkc9\" (UniqueName: \"kubernetes.io/projected/4066f186-9cb5-45aa-b295-dffe32f3aa75-kube-api-access-9nkc9\") pod \"redhat-operators-w968c\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") " pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.941031 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-utilities\") pod \"redhat-operators-w968c\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") " pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:33 crc kubenswrapper[4802]: I1201 20:31:33.941096 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-catalog-content\") pod \"redhat-operators-w968c\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") " pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:34 crc kubenswrapper[4802]: I1201 20:31:34.042554 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-utilities\") pod \"redhat-operators-w968c\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") " pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:34 crc kubenswrapper[4802]: I1201 20:31:34.042638 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-catalog-content\") pod \"redhat-operators-w968c\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") " pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:34 crc kubenswrapper[4802]: I1201 20:31:34.042668 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nkc9\" (UniqueName: \"kubernetes.io/projected/4066f186-9cb5-45aa-b295-dffe32f3aa75-kube-api-access-9nkc9\") pod \"redhat-operators-w968c\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") " pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:34 crc kubenswrapper[4802]: I1201 20:31:34.043348 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-catalog-content\") pod \"redhat-operators-w968c\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") " pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:34 crc kubenswrapper[4802]: I1201 20:31:34.043436 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-utilities\") pod \"redhat-operators-w968c\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") " pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:34 crc kubenswrapper[4802]: I1201 20:31:34.066669 4802
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nkc9\" (UniqueName: \"kubernetes.io/projected/4066f186-9cb5-45aa-b295-dffe32f3aa75-kube-api-access-9nkc9\") pod \"redhat-operators-w968c\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") " pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:34 crc kubenswrapper[4802]: I1201 20:31:34.160754 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:34 crc kubenswrapper[4802]: I1201 20:31:34.615470 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w968c"]
Dec 01 20:31:35 crc kubenswrapper[4802]: I1201 20:31:35.542974 4802 generic.go:334] "Generic (PLEG): container finished" podID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerID="074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454" exitCode=0
Dec 01 20:31:35 crc kubenswrapper[4802]: I1201 20:31:35.543045 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w968c" event={"ID":"4066f186-9cb5-45aa-b295-dffe32f3aa75","Type":"ContainerDied","Data":"074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454"}
Dec 01 20:31:35 crc kubenswrapper[4802]: I1201 20:31:35.544415 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w968c" event={"ID":"4066f186-9cb5-45aa-b295-dffe32f3aa75","Type":"ContainerStarted","Data":"ea4554e11694ed612f7bb0232e83294804590e3bbfbda33a47b0d934b5dc2f99"}
Dec 01 20:31:35 crc kubenswrapper[4802]: I1201 20:31:35.545353 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 20:31:37 crc kubenswrapper[4802]: I1201 20:31:37.563846 4802 generic.go:334] "Generic (PLEG): container finished" podID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerID="36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9" exitCode=0
Dec 01 20:31:37 crc kubenswrapper[4802]: I1201 20:31:37.563965 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w968c" event={"ID":"4066f186-9cb5-45aa-b295-dffe32f3aa75","Type":"ContainerDied","Data":"36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9"}
Dec 01 20:31:38 crc kubenswrapper[4802]: I1201 20:31:38.576630 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w968c" event={"ID":"4066f186-9cb5-45aa-b295-dffe32f3aa75","Type":"ContainerStarted","Data":"05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522"}
Dec 01 20:31:38 crc kubenswrapper[4802]: I1201 20:31:38.605316 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w968c" podStartSLOduration=3.136059302 podStartE2EDuration="5.605293034s" podCreationTimestamp="2025-12-01 20:31:33 +0000 UTC" firstStartedPulling="2025-12-01 20:31:35.545076048 +0000 UTC m=+2117.107635689" lastFinishedPulling="2025-12-01 20:31:38.01430974 +0000 UTC m=+2119.576869421" observedRunningTime="2025-12-01 20:31:38.593819264 +0000 UTC m=+2120.156378915" watchObservedRunningTime="2025-12-01 20:31:38.605293034 +0000 UTC m=+2120.167852695"
Dec 01 20:31:44 crc kubenswrapper[4802]: I1201 20:31:44.166123 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:44 crc kubenswrapper[4802]: I1201 20:31:44.167377 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:44 crc kubenswrapper[4802]: I1201 20:31:44.212590 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:44 crc kubenswrapper[4802]: I1201 20:31:44.662280 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:44 crc kubenswrapper[4802]: I1201 20:31:44.710925 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w968c"]
Dec 01 20:31:46 crc kubenswrapper[4802]: I1201 20:31:46.639053 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w968c" podUID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerName="registry-server" containerID="cri-o://05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522" gracePeriod=2
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.123891 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.287093 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nkc9\" (UniqueName: \"kubernetes.io/projected/4066f186-9cb5-45aa-b295-dffe32f3aa75-kube-api-access-9nkc9\") pod \"4066f186-9cb5-45aa-b295-dffe32f3aa75\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") "
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.287310 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-utilities\") pod \"4066f186-9cb5-45aa-b295-dffe32f3aa75\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") "
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.287347 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-catalog-content\") pod \"4066f186-9cb5-45aa-b295-dffe32f3aa75\" (UID: \"4066f186-9cb5-45aa-b295-dffe32f3aa75\") "
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.288786 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-utilities" (OuterVolumeSpecName: "utilities") pod "4066f186-9cb5-45aa-b295-dffe32f3aa75" (UID: "4066f186-9cb5-45aa-b295-dffe32f3aa75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.294011 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4066f186-9cb5-45aa-b295-dffe32f3aa75-kube-api-access-9nkc9" (OuterVolumeSpecName: "kube-api-access-9nkc9") pod "4066f186-9cb5-45aa-b295-dffe32f3aa75" (UID: "4066f186-9cb5-45aa-b295-dffe32f3aa75"). InnerVolumeSpecName "kube-api-access-9nkc9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.390095 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.390144 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nkc9\" (UniqueName: \"kubernetes.io/projected/4066f186-9cb5-45aa-b295-dffe32f3aa75-kube-api-access-9nkc9\") on node \"crc\" DevicePath \"\""
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.649104 4802 generic.go:334] "Generic (PLEG): container finished" podID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerID="05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522" exitCode=0
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.649162 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w968c" event={"ID":"4066f186-9cb5-45aa-b295-dffe32f3aa75","Type":"ContainerDied","Data":"05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522"}
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.649436 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/redhat-operators-w968c" event={"ID":"4066f186-9cb5-45aa-b295-dffe32f3aa75","Type":"ContainerDied","Data":"ea4554e11694ed612f7bb0232e83294804590e3bbfbda33a47b0d934b5dc2f99"}
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.649459 4802 scope.go:117] "RemoveContainer" containerID="05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.649217 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w968c"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.668722 4802 scope.go:117] "RemoveContainer" containerID="36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.692460 4802 scope.go:117] "RemoveContainer" containerID="074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.740458 4802 scope.go:117] "RemoveContainer" containerID="05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522"
Dec 01 20:31:47 crc kubenswrapper[4802]: E1201 20:31:47.740835 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522\": container with ID starting with 05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522 not found: ID does not exist" containerID="05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.740870 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522"} err="failed to get container status \"05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522\": rpc error: code = NotFound desc = could not find container \"05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522\": container with ID starting with 05ef5c05c8b524f534a59b7d31d1fba09e00e3a3d4fd6e71c5908b600a32d522 not found: ID does not exist"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.740899 4802 scope.go:117] "RemoveContainer" containerID="36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9"
Dec 01 20:31:47 crc kubenswrapper[4802]: E1201 20:31:47.741118 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9\": container with ID starting with 36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9 not found: ID does not exist" containerID="36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.741150 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9"} err="failed to get container status \"36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9\": rpc error: code = NotFound desc = could not find container \"36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9\": container with ID starting with 36b733f34958f032d07442448f8ff1a0d25791aa9b0f88c6ac214b8e4fea09c9 not found: ID does not exist"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.741170 4802 scope.go:117] "RemoveContainer" containerID="074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454"
Dec 01 20:31:47 crc kubenswrapper[4802]: E1201 20:31:47.741480 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454\": container with ID starting with 074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454 not found: ID does not exist" containerID="074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454"
Dec 01 20:31:47 crc kubenswrapper[4802]: I1201 20:31:47.741538 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454"} err="failed to get container status \"074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454\": rpc error: code = NotFound desc = could not find container \"074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454\": container with ID starting with 074c56805b3b71094abad42300a04c30c86ccdb8247b298f87221874d6de0454 not found: ID does not exist"
Dec 01 20:31:49 crc kubenswrapper[4802]: I1201 20:31:49.209910 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4066f186-9cb5-45aa-b295-dffe32f3aa75" (UID: "4066f186-9cb5-45aa-b295-dffe32f3aa75"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:31:49 crc kubenswrapper[4802]: I1201 20:31:49.222798 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4066f186-9cb5-45aa-b295-dffe32f3aa75-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 20:31:49 crc kubenswrapper[4802]: I1201 20:31:49.490792 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w968c"]
Dec 01 20:31:49 crc kubenswrapper[4802]: I1201 20:31:49.500921 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w968c"]
Dec 01 20:31:50 crc kubenswrapper[4802]: I1201 20:31:50.730857 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4066f186-9cb5-45aa-b295-dffe32f3aa75" path="/var/lib/kubelet/pods/4066f186-9cb5-45aa-b295-dffe32f3aa75/volumes"
Dec 01 20:32:28 crc kubenswrapper[4802]: I1201 20:32:28.088913 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 20:32:28 crc kubenswrapper[4802]: I1201 20:32:28.091240 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 20:32:58 crc kubenswrapper[4802]: I1201 20:32:58.089187 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 20:32:58 crc kubenswrapper[4802]: I1201 20:32:58.089630 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.160091 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.170147 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.179835 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4str6"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.189539 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.195740 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jfldg"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.201911 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.208740 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wg72m"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.215106 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gx22q"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.221496 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.227504 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rx5vs"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.234663 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.244357 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8wwpr"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.251637 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.257996 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.264306 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.270725 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8wwpr"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.276014 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wbhsx"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.281340 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wmmkk"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.286959 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gxg67"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.292303 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dq5sf"]
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.738415 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c88831-4819-4a0b-901c-4b00adfda2a4" path="/var/lib/kubelet/pods/06c88831-4819-4a0b-901c-4b00adfda2a4/volumes"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.740160 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10691218-713c-4982-979e-cb0dde9077ed" path="/var/lib/kubelet/pods/10691218-713c-4982-979e-cb0dde9077ed/volumes"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.741484 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c587bf8-2825-488a-8a22-d224dfdd6073" path="/var/lib/kubelet/pods/2c587bf8-2825-488a-8a22-d224dfdd6073/volumes"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.743019 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d19da6a-24ea-49b2-9415-100c4db7bccd" path="/var/lib/kubelet/pods/3d19da6a-24ea-49b2-9415-100c4db7bccd/volumes"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.745836 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49bf0a46-b184-40ac-8f37-4c8c38d1e455" path="/var/lib/kubelet/pods/49bf0a46-b184-40ac-8f37-4c8c38d1e455/volumes"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.747011 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e509bad-279a-4da8-9f77-06123708e579" path="/var/lib/kubelet/pods/4e509bad-279a-4da8-9f77-06123708e579/volumes"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.748151 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567bc985-ed80-43a1-8063-471d500ed6b6" path="/var/lib/kubelet/pods/567bc985-ed80-43a1-8063-471d500ed6b6/volumes"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.749788 4802
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d276417-7411-4240-8260-5a976730047e" path="/var/lib/kubelet/pods/5d276417-7411-4240-8260-5a976730047e/volumes"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.750440 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b86803dd-2f31-412c-bb40-44235c925362" path="/var/lib/kubelet/pods/b86803dd-2f31-412c-bb40-44235c925362/volumes"
Dec 01 20:33:22 crc kubenswrapper[4802]: I1201 20:33:22.751085 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd889212-5f48-48ab-9f9a-5b028760aa6d" path="/var/lib/kubelet/pods/fd889212-5f48-48ab-9f9a-5b028760aa6d/volumes"
Dec 01 20:33:25 crc kubenswrapper[4802]: I1201 20:33:25.970678 4802 scope.go:117] "RemoveContainer" containerID="76d71391080ac6c1aaac302c26c6522a934e42632bf335cadc6272070bb6ef3e"
Dec 01 20:33:26 crc kubenswrapper[4802]: I1201 20:33:26.037652 4802 scope.go:117] "RemoveContainer" containerID="0012d1bb733d38e8d278c8063965362705e5efff7b52466a5e7bd185ef65b802"
Dec 01 20:33:26 crc kubenswrapper[4802]: I1201 20:33:26.092427 4802 scope.go:117] "RemoveContainer" containerID="7a4a1a0864be80e25e346507f6fb25cb3c4ad99abda57c0c8e9d94811953c300"
Dec 01 20:33:26 crc kubenswrapper[4802]: I1201 20:33:26.185083 4802 scope.go:117] "RemoveContainer" containerID="a9e3908a710fd4d9ddd3ffa20e0dcf6be99dd2951b56d29cc36f0da2256a2480"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.114576 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zm7xk"]
Dec 01 20:33:27 crc kubenswrapper[4802]: E1201 20:33:27.115661 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerName="registry-server"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.115702 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerName="registry-server"
Dec 01 20:33:27 crc kubenswrapper[4802]: E1201 20:33:27.115739 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerName="extract-content"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.115752 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerName="extract-content"
Dec 01 20:33:27 crc kubenswrapper[4802]: E1201 20:33:27.115780 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerName="extract-utilities"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.115793 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerName="extract-utilities"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.116111 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="4066f186-9cb5-45aa-b295-dffe32f3aa75" containerName="registry-server"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.118662 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.123303 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zm7xk"]
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.218764 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t2r6\" (UniqueName: \"kubernetes.io/projected/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-kube-api-access-6t2r6\") pod \"certified-operators-zm7xk\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.218852 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-utilities\") pod \"certified-operators-zm7xk\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.218887 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-catalog-content\") pod \"certified-operators-zm7xk\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.320649 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t2r6\" (UniqueName: \"kubernetes.io/projected/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-kube-api-access-6t2r6\") pod \"certified-operators-zm7xk\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.320729 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-utilities\") pod \"certified-operators-zm7xk\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.320750 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-catalog-content\") pod \"certified-operators-zm7xk\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.321179 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-catalog-content\") pod \"certified-operators-zm7xk\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.321267 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-utilities\") pod \"certified-operators-zm7xk\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.339444 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t2r6\" (UniqueName: \"kubernetes.io/projected/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-kube-api-access-6t2r6\") pod \"certified-operators-zm7xk\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.441462 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zm7xk"
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.958489 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zm7xk"]
Dec 01 20:33:27 crc kubenswrapper[4802]: W1201 20:33:27.964176 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d3eeb11_db75_4a7a_b750_1ff1ebab7485.slice/crio-71033947a687f271116c970b2ae0d7c560fc8288fb5e1aecd330e26a6f57d276 WatchSource:0}: Error finding container 71033947a687f271116c970b2ae0d7c560fc8288fb5e1aecd330e26a6f57d276: Status 404 returned error can't find the container with id 71033947a687f271116c970b2ae0d7c560fc8288fb5e1aecd330e26a6f57d276
Dec 01 20:33:27 crc kubenswrapper[4802]: I1201 20:33:27.999594 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp"]
Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.000866 4802 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.012639 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.012693 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.012857 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.012974 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.013079 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.024460 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp"] Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.033786 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.034034 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: 
\"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.034143 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.034291 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtlst\" (UniqueName: \"kubernetes.io/projected/d5a316ab-c296-4ab8-8397-00e5a017d1cc-kube-api-access-jtlst\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.034483 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.088139 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.088183 4802 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.088234 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.088987 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.089063 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" gracePeriod=600 Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.135867 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.135948 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ceph\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.135971 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.135989 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.136026 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtlst\" (UniqueName: \"kubernetes.io/projected/d5a316ab-c296-4ab8-8397-00e5a017d1cc-kube-api-access-jtlst\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.143139 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.145223 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.148793 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.149565 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.173071 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtlst\" (UniqueName: \"kubernetes.io/projected/d5a316ab-c296-4ab8-8397-00e5a017d1cc-kube-api-access-jtlst\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: E1201 20:33:28.225485 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.333084 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.646350 4802 generic.go:334] "Generic (PLEG): container finished" podID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerID="960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e" exitCode=0 Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.646402 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zm7xk" event={"ID":"0d3eeb11-db75-4a7a-b750-1ff1ebab7485","Type":"ContainerDied","Data":"960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e"} Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.646454 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zm7xk" event={"ID":"0d3eeb11-db75-4a7a-b750-1ff1ebab7485","Type":"ContainerStarted","Data":"71033947a687f271116c970b2ae0d7c560fc8288fb5e1aecd330e26a6f57d276"} Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.648649 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" exitCode=0 Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.648674 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88"} Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.648696 4802 
scope.go:117] "RemoveContainer" containerID="26cf5483ade0dbc95267728676ce0ccc960449233741d088558ac81f81de5fa8" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.649838 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:33:28 crc kubenswrapper[4802]: E1201 20:33:28.650951 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:33:28 crc kubenswrapper[4802]: I1201 20:33:28.878541 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp"] Dec 01 20:33:28 crc kubenswrapper[4802]: W1201 20:33:28.879584 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5a316ab_c296_4ab8_8397_00e5a017d1cc.slice/crio-9aca862d8e9b66471e191c20cfb42ec977e80452e2440dd4ca759990d98d2215 WatchSource:0}: Error finding container 9aca862d8e9b66471e191c20cfb42ec977e80452e2440dd4ca759990d98d2215: Status 404 returned error can't find the container with id 9aca862d8e9b66471e191c20cfb42ec977e80452e2440dd4ca759990d98d2215 Dec 01 20:33:29 crc kubenswrapper[4802]: I1201 20:33:29.663852 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" event={"ID":"d5a316ab-c296-4ab8-8397-00e5a017d1cc","Type":"ContainerStarted","Data":"9aca862d8e9b66471e191c20cfb42ec977e80452e2440dd4ca759990d98d2215"} Dec 01 20:33:30 crc kubenswrapper[4802]: I1201 20:33:30.681433 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-zm7xk" event={"ID":"0d3eeb11-db75-4a7a-b750-1ff1ebab7485","Type":"ContainerStarted","Data":"70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91"} Dec 01 20:33:30 crc kubenswrapper[4802]: I1201 20:33:30.688830 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" event={"ID":"d5a316ab-c296-4ab8-8397-00e5a017d1cc","Type":"ContainerStarted","Data":"b7f903ab37e8c324b72962b929179c329f664c4b1537c76abb9775bcb67b9031"} Dec 01 20:33:30 crc kubenswrapper[4802]: I1201 20:33:30.733622 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" podStartSLOduration=2.54773898 podStartE2EDuration="3.73358868s" podCreationTimestamp="2025-12-01 20:33:27 +0000 UTC" firstStartedPulling="2025-12-01 20:33:28.881862301 +0000 UTC m=+2230.444421942" lastFinishedPulling="2025-12-01 20:33:30.067712001 +0000 UTC m=+2231.630271642" observedRunningTime="2025-12-01 20:33:30.725383693 +0000 UTC m=+2232.287943414" watchObservedRunningTime="2025-12-01 20:33:30.73358868 +0000 UTC m=+2232.296148391" Dec 01 20:33:31 crc kubenswrapper[4802]: I1201 20:33:31.698924 4802 generic.go:334] "Generic (PLEG): container finished" podID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerID="70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91" exitCode=0 Dec 01 20:33:31 crc kubenswrapper[4802]: I1201 20:33:31.699183 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zm7xk" event={"ID":"0d3eeb11-db75-4a7a-b750-1ff1ebab7485","Type":"ContainerDied","Data":"70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91"} Dec 01 20:33:32 crc kubenswrapper[4802]: I1201 20:33:32.711794 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zm7xk" 
event={"ID":"0d3eeb11-db75-4a7a-b750-1ff1ebab7485","Type":"ContainerStarted","Data":"00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc"} Dec 01 20:33:32 crc kubenswrapper[4802]: I1201 20:33:32.736549 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zm7xk" podStartSLOduration=2.234689674 podStartE2EDuration="5.736525955s" podCreationTimestamp="2025-12-01 20:33:27 +0000 UTC" firstStartedPulling="2025-12-01 20:33:28.64814488 +0000 UTC m=+2230.210704521" lastFinishedPulling="2025-12-01 20:33:32.149981161 +0000 UTC m=+2233.712540802" observedRunningTime="2025-12-01 20:33:32.728307658 +0000 UTC m=+2234.290867319" watchObservedRunningTime="2025-12-01 20:33:32.736525955 +0000 UTC m=+2234.299085596" Dec 01 20:33:37 crc kubenswrapper[4802]: I1201 20:33:37.441863 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zm7xk" Dec 01 20:33:37 crc kubenswrapper[4802]: I1201 20:33:37.442553 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zm7xk" Dec 01 20:33:37 crc kubenswrapper[4802]: I1201 20:33:37.483780 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zm7xk" Dec 01 20:33:37 crc kubenswrapper[4802]: I1201 20:33:37.801742 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zm7xk" Dec 01 20:33:37 crc kubenswrapper[4802]: I1201 20:33:37.862225 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zm7xk"] Dec 01 20:33:39 crc kubenswrapper[4802]: I1201 20:33:39.784627 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zm7xk" podUID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerName="registry-server" 
containerID="cri-o://00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc" gracePeriod=2 Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.777921 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zm7xk" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.795099 4802 generic.go:334] "Generic (PLEG): container finished" podID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerID="00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc" exitCode=0 Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.795262 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zm7xk" event={"ID":"0d3eeb11-db75-4a7a-b750-1ff1ebab7485","Type":"ContainerDied","Data":"00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc"} Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.796307 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zm7xk" event={"ID":"0d3eeb11-db75-4a7a-b750-1ff1ebab7485","Type":"ContainerDied","Data":"71033947a687f271116c970b2ae0d7c560fc8288fb5e1aecd330e26a6f57d276"} Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.795332 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zm7xk" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.796342 4802 scope.go:117] "RemoveContainer" containerID="00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.824507 4802 scope.go:117] "RemoveContainer" containerID="70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.845405 4802 scope.go:117] "RemoveContainer" containerID="960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.876339 4802 scope.go:117] "RemoveContainer" containerID="00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc" Dec 01 20:33:40 crc kubenswrapper[4802]: E1201 20:33:40.876759 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc\": container with ID starting with 00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc not found: ID does not exist" containerID="00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.876800 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc"} err="failed to get container status \"00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc\": rpc error: code = NotFound desc = could not find container \"00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc\": container with ID starting with 00310771d56934ed32a409ccbabaf2422604d9053b94e631459732109a5a18bc not found: ID does not exist" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.876829 4802 scope.go:117] "RemoveContainer" 
containerID="70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91" Dec 01 20:33:40 crc kubenswrapper[4802]: E1201 20:33:40.877372 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91\": container with ID starting with 70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91 not found: ID does not exist" containerID="70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.877410 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91"} err="failed to get container status \"70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91\": rpc error: code = NotFound desc = could not find container \"70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91\": container with ID starting with 70ed1b789b5340557adb69b2947c5c6933eb062421801fe1b95806d9ac9b2e91 not found: ID does not exist" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.877431 4802 scope.go:117] "RemoveContainer" containerID="960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e" Dec 01 20:33:40 crc kubenswrapper[4802]: E1201 20:33:40.877638 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e\": container with ID starting with 960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e not found: ID does not exist" containerID="960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.877668 4802 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e"} err="failed to get container status \"960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e\": rpc error: code = NotFound desc = could not find container \"960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e\": container with ID starting with 960a4880811410a40a4222c2fa3f97a5869d788f739ffca73e85a0f97bc6642e not found: ID does not exist" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.922276 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t2r6\" (UniqueName: \"kubernetes.io/projected/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-kube-api-access-6t2r6\") pod \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.922350 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-catalog-content\") pod \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.922397 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-utilities\") pod \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\" (UID: \"0d3eeb11-db75-4a7a-b750-1ff1ebab7485\") " Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.923779 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-utilities" (OuterVolumeSpecName: "utilities") pod "0d3eeb11-db75-4a7a-b750-1ff1ebab7485" (UID: "0d3eeb11-db75-4a7a-b750-1ff1ebab7485"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:33:40 crc kubenswrapper[4802]: I1201 20:33:40.927866 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-kube-api-access-6t2r6" (OuterVolumeSpecName: "kube-api-access-6t2r6") pod "0d3eeb11-db75-4a7a-b750-1ff1ebab7485" (UID: "0d3eeb11-db75-4a7a-b750-1ff1ebab7485"). InnerVolumeSpecName "kube-api-access-6t2r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:33:41 crc kubenswrapper[4802]: I1201 20:33:41.023984 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:33:41 crc kubenswrapper[4802]: I1201 20:33:41.024019 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t2r6\" (UniqueName: \"kubernetes.io/projected/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-kube-api-access-6t2r6\") on node \"crc\" DevicePath \"\"" Dec 01 20:33:41 crc kubenswrapper[4802]: I1201 20:33:41.229241 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d3eeb11-db75-4a7a-b750-1ff1ebab7485" (UID: "0d3eeb11-db75-4a7a-b750-1ff1ebab7485"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:33:41 crc kubenswrapper[4802]: I1201 20:33:41.331099 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d3eeb11-db75-4a7a-b750-1ff1ebab7485-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:33:41 crc kubenswrapper[4802]: I1201 20:33:41.429422 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zm7xk"] Dec 01 20:33:41 crc kubenswrapper[4802]: I1201 20:33:41.438677 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zm7xk"] Dec 01 20:33:41 crc kubenswrapper[4802]: I1201 20:33:41.720827 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:33:41 crc kubenswrapper[4802]: E1201 20:33:41.721302 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:33:42 crc kubenswrapper[4802]: I1201 20:33:42.733396 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" path="/var/lib/kubelet/pods/0d3eeb11-db75-4a7a-b750-1ff1ebab7485/volumes" Dec 01 20:33:44 crc kubenswrapper[4802]: I1201 20:33:44.855038 4802 generic.go:334] "Generic (PLEG): container finished" podID="d5a316ab-c296-4ab8-8397-00e5a017d1cc" containerID="b7f903ab37e8c324b72962b929179c329f664c4b1537c76abb9775bcb67b9031" exitCode=0 Dec 01 20:33:44 crc kubenswrapper[4802]: I1201 20:33:44.855393 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" event={"ID":"d5a316ab-c296-4ab8-8397-00e5a017d1cc","Type":"ContainerDied","Data":"b7f903ab37e8c324b72962b929179c329f664c4b1537c76abb9775bcb67b9031"} Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.226687 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.427955 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-repo-setup-combined-ca-bundle\") pod \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.428122 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ceph\") pod \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.428210 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtlst\" (UniqueName: \"kubernetes.io/projected/d5a316ab-c296-4ab8-8397-00e5a017d1cc-kube-api-access-jtlst\") pod \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.428323 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ssh-key\") pod \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.428415 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-inventory\") pod \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\" (UID: \"d5a316ab-c296-4ab8-8397-00e5a017d1cc\") " Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.433258 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d5a316ab-c296-4ab8-8397-00e5a017d1cc" (UID: "d5a316ab-c296-4ab8-8397-00e5a017d1cc"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.435185 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ceph" (OuterVolumeSpecName: "ceph") pod "d5a316ab-c296-4ab8-8397-00e5a017d1cc" (UID: "d5a316ab-c296-4ab8-8397-00e5a017d1cc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.445475 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a316ab-c296-4ab8-8397-00e5a017d1cc-kube-api-access-jtlst" (OuterVolumeSpecName: "kube-api-access-jtlst") pod "d5a316ab-c296-4ab8-8397-00e5a017d1cc" (UID: "d5a316ab-c296-4ab8-8397-00e5a017d1cc"). InnerVolumeSpecName "kube-api-access-jtlst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.453506 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d5a316ab-c296-4ab8-8397-00e5a017d1cc" (UID: "d5a316ab-c296-4ab8-8397-00e5a017d1cc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.454406 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-inventory" (OuterVolumeSpecName: "inventory") pod "d5a316ab-c296-4ab8-8397-00e5a017d1cc" (UID: "d5a316ab-c296-4ab8-8397-00e5a017d1cc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.530726 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.530756 4802 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.530771 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.530786 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtlst\" (UniqueName: \"kubernetes.io/projected/d5a316ab-c296-4ab8-8397-00e5a017d1cc-kube-api-access-jtlst\") on node \"crc\" DevicePath \"\"" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.530797 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d5a316ab-c296-4ab8-8397-00e5a017d1cc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.871125 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" 
event={"ID":"d5a316ab-c296-4ab8-8397-00e5a017d1cc","Type":"ContainerDied","Data":"9aca862d8e9b66471e191c20cfb42ec977e80452e2440dd4ca759990d98d2215"} Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.871164 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aca862d8e9b66471e191c20cfb42ec977e80452e2440dd4ca759990d98d2215" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.871261 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.962184 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5"] Dec 01 20:33:46 crc kubenswrapper[4802]: E1201 20:33:46.962737 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerName="registry-server" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.962767 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerName="registry-server" Dec 01 20:33:46 crc kubenswrapper[4802]: E1201 20:33:46.962792 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerName="extract-utilities" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.962801 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerName="extract-utilities" Dec 01 20:33:46 crc kubenswrapper[4802]: E1201 20:33:46.962840 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerName="extract-content" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.962848 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerName="extract-content" Dec 01 20:33:46 crc 
kubenswrapper[4802]: E1201 20:33:46.962870 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a316ab-c296-4ab8-8397-00e5a017d1cc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.962880 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a316ab-c296-4ab8-8397-00e5a017d1cc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.963087 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a316ab-c296-4ab8-8397-00e5a017d1cc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.963123 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3eeb11-db75-4a7a-b750-1ff1ebab7485" containerName="registry-server" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.963908 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.966961 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.966978 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.969177 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.971827 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:33:46 crc kubenswrapper[4802]: I1201 20:33:46.972013 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:33:46 crc 
kubenswrapper[4802]: I1201 20:33:46.980226 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5"] Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.140653 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.140777 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.140839 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.140876 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv89b\" (UniqueName: \"kubernetes.io/projected/e04c0b98-f144-4917-be97-11a6b8f2b449-kube-api-access-vv89b\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 
20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.140957 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.242985 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.243271 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.243321 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.243351 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.243372 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv89b\" (UniqueName: \"kubernetes.io/projected/e04c0b98-f144-4917-be97-11a6b8f2b449-kube-api-access-vv89b\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.247320 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.247396 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.247521 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.248186 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.261437 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv89b\" (UniqueName: \"kubernetes.io/projected/e04c0b98-f144-4917-be97-11a6b8f2b449-kube-api-access-vv89b\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.289380 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.795999 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5"] Dec 01 20:33:47 crc kubenswrapper[4802]: I1201 20:33:47.879662 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" event={"ID":"e04c0b98-f144-4917-be97-11a6b8f2b449","Type":"ContainerStarted","Data":"c4f5a27e889949163e7177582f9a442ce70c46d112a9508be9a50d2fb6e47b1e"} Dec 01 20:33:48 crc kubenswrapper[4802]: I1201 20:33:48.888697 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" event={"ID":"e04c0b98-f144-4917-be97-11a6b8f2b449","Type":"ContainerStarted","Data":"2e2c99a9c81b2700c2b1940d1dd7d0cfbd389e264fd3508eaeb2f192053f8f29"} Dec 01 20:33:48 crc kubenswrapper[4802]: I1201 20:33:48.908332 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" podStartSLOduration=2.307937784 podStartE2EDuration="2.908310992s" podCreationTimestamp="2025-12-01 20:33:46 +0000 UTC" firstStartedPulling="2025-12-01 20:33:47.803899014 +0000 UTC m=+2249.366458655" lastFinishedPulling="2025-12-01 20:33:48.404272222 +0000 UTC m=+2249.966831863" observedRunningTime="2025-12-01 20:33:48.904980048 +0000 UTC m=+2250.467539719" watchObservedRunningTime="2025-12-01 20:33:48.908310992 +0000 UTC m=+2250.470870633" Dec 01 20:33:55 crc kubenswrapper[4802]: I1201 20:33:55.721427 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:33:55 crc kubenswrapper[4802]: E1201 20:33:55.722965 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:34:07 crc kubenswrapper[4802]: I1201 20:34:07.720005 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:34:07 crc kubenswrapper[4802]: E1201 20:34:07.720888 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:34:21 crc kubenswrapper[4802]: I1201 20:34:21.720519 4802 scope.go:117] "RemoveContainer" 
containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:34:21 crc kubenswrapper[4802]: E1201 20:34:21.721696 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:34:26 crc kubenswrapper[4802]: I1201 20:34:26.286132 4802 scope.go:117] "RemoveContainer" containerID="31d5ccfcebe6d7948a392f0c8d4d3481f47c32e1c8d108915b2c0c1c3856e5a4" Dec 01 20:34:26 crc kubenswrapper[4802]: I1201 20:34:26.320748 4802 scope.go:117] "RemoveContainer" containerID="1fcb25fc29970b4c0d3998d519cf89e5f05646e4403fe83c821708fab03bfb6b" Dec 01 20:34:26 crc kubenswrapper[4802]: I1201 20:34:26.399817 4802 scope.go:117] "RemoveContainer" containerID="b6f49da08bb316904fe562afe37558520b5fd287779804dca36b434fed9a5db5" Dec 01 20:34:36 crc kubenswrapper[4802]: I1201 20:34:36.720451 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:34:36 crc kubenswrapper[4802]: E1201 20:34:36.735455 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:34:47 crc kubenswrapper[4802]: I1201 20:34:47.720477 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:34:47 crc 
kubenswrapper[4802]: E1201 20:34:47.722788 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:35:02 crc kubenswrapper[4802]: I1201 20:35:02.720479 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:35:02 crc kubenswrapper[4802]: E1201 20:35:02.721459 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:35:17 crc kubenswrapper[4802]: I1201 20:35:17.721761 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:35:17 crc kubenswrapper[4802]: E1201 20:35:17.723118 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:35:26 crc kubenswrapper[4802]: I1201 20:35:26.534722 4802 scope.go:117] "RemoveContainer" containerID="ad4d84e14b1fc7d09b7a8f0a0a47e21569efc3fd5b67c97d7df28751efef04b1" Dec 
01 20:35:29 crc kubenswrapper[4802]: I1201 20:35:29.720052 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:35:29 crc kubenswrapper[4802]: E1201 20:35:29.722119 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:35:40 crc kubenswrapper[4802]: I1201 20:35:40.720562 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:35:40 crc kubenswrapper[4802]: E1201 20:35:40.721338 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:35:54 crc kubenswrapper[4802]: I1201 20:35:54.720861 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:35:54 crc kubenswrapper[4802]: E1201 20:35:54.721752 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" 
podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:35:57 crc kubenswrapper[4802]: I1201 20:35:57.648375 4802 generic.go:334] "Generic (PLEG): container finished" podID="e04c0b98-f144-4917-be97-11a6b8f2b449" containerID="2e2c99a9c81b2700c2b1940d1dd7d0cfbd389e264fd3508eaeb2f192053f8f29" exitCode=0 Dec 01 20:35:57 crc kubenswrapper[4802]: I1201 20:35:57.648503 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" event={"ID":"e04c0b98-f144-4917-be97-11a6b8f2b449","Type":"ContainerDied","Data":"2e2c99a9c81b2700c2b1940d1dd7d0cfbd389e264fd3508eaeb2f192053f8f29"} Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.091924 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.117277 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ssh-key\") pod \"e04c0b98-f144-4917-be97-11a6b8f2b449\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.117370 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-bootstrap-combined-ca-bundle\") pod \"e04c0b98-f144-4917-be97-11a6b8f2b449\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.117434 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-inventory\") pod \"e04c0b98-f144-4917-be97-11a6b8f2b449\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.117509 4802 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv89b\" (UniqueName: \"kubernetes.io/projected/e04c0b98-f144-4917-be97-11a6b8f2b449-kube-api-access-vv89b\") pod \"e04c0b98-f144-4917-be97-11a6b8f2b449\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.117529 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ceph\") pod \"e04c0b98-f144-4917-be97-11a6b8f2b449\" (UID: \"e04c0b98-f144-4917-be97-11a6b8f2b449\") " Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.132222 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ceph" (OuterVolumeSpecName: "ceph") pod "e04c0b98-f144-4917-be97-11a6b8f2b449" (UID: "e04c0b98-f144-4917-be97-11a6b8f2b449"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.132306 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e04c0b98-f144-4917-be97-11a6b8f2b449" (UID: "e04c0b98-f144-4917-be97-11a6b8f2b449"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.132341 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e04c0b98-f144-4917-be97-11a6b8f2b449-kube-api-access-vv89b" (OuterVolumeSpecName: "kube-api-access-vv89b") pod "e04c0b98-f144-4917-be97-11a6b8f2b449" (UID: "e04c0b98-f144-4917-be97-11a6b8f2b449"). InnerVolumeSpecName "kube-api-access-vv89b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.142958 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e04c0b98-f144-4917-be97-11a6b8f2b449" (UID: "e04c0b98-f144-4917-be97-11a6b8f2b449"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.148754 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-inventory" (OuterVolumeSpecName: "inventory") pod "e04c0b98-f144-4917-be97-11a6b8f2b449" (UID: "e04c0b98-f144-4917-be97-11a6b8f2b449"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.219912 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv89b\" (UniqueName: \"kubernetes.io/projected/e04c0b98-f144-4917-be97-11a6b8f2b449-kube-api-access-vv89b\") on node \"crc\" DevicePath \"\"" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.219960 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.219987 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.220004 4802 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:35:59 crc 
kubenswrapper[4802]: I1201 20:35:59.220019 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e04c0b98-f144-4917-be97-11a6b8f2b449-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.668265 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" event={"ID":"e04c0b98-f144-4917-be97-11a6b8f2b449","Type":"ContainerDied","Data":"c4f5a27e889949163e7177582f9a442ce70c46d112a9508be9a50d2fb6e47b1e"} Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.668318 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4f5a27e889949163e7177582f9a442ce70c46d112a9508be9a50d2fb6e47b1e" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.668332 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.791522 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8"] Dec 01 20:35:59 crc kubenswrapper[4802]: E1201 20:35:59.792239 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04c0b98-f144-4917-be97-11a6b8f2b449" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.792261 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04c0b98-f144-4917-be97-11a6b8f2b449" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.792422 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="e04c0b98-f144-4917-be97-11a6b8f2b449" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.793006 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.796379 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.796380 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.796728 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.796787 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.796966 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.799188 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8"] Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.832208 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.832266 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fshh9\" (UniqueName: \"kubernetes.io/projected/3a154c18-5d93-4d73-9e97-90fb21082eea-kube-api-access-fshh9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: 
\"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.832318 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.832417 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.933704 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fshh9\" (UniqueName: \"kubernetes.io/projected/3a154c18-5d93-4d73-9e97-90fb21082eea-kube-api-access-fshh9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.933780 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.933892 
4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.933937 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.938403 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.940509 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.941055 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:35:59 crc kubenswrapper[4802]: I1201 20:35:59.951570 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fshh9\" (UniqueName: \"kubernetes.io/projected/3a154c18-5d93-4d73-9e97-90fb21082eea-kube-api-access-fshh9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-g62m8\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:36:00 crc kubenswrapper[4802]: I1201 20:36:00.116051 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:36:00 crc kubenswrapper[4802]: I1201 20:36:00.607829 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8"] Dec 01 20:36:00 crc kubenswrapper[4802]: I1201 20:36:00.677251 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" event={"ID":"3a154c18-5d93-4d73-9e97-90fb21082eea","Type":"ContainerStarted","Data":"4b7d194e495e333b144109eecbcc85f4b0722c4c8656ef4e15db768900d48252"} Dec 01 20:36:01 crc kubenswrapper[4802]: I1201 20:36:01.685749 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" event={"ID":"3a154c18-5d93-4d73-9e97-90fb21082eea","Type":"ContainerStarted","Data":"e31e6e2b3fecef3a16be08a101d7a386f6638070ba7a4c6c188e9f9b58a418f0"} Dec 01 20:36:01 crc kubenswrapper[4802]: I1201 20:36:01.714444 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" podStartSLOduration=2.189269237 podStartE2EDuration="2.714424862s" podCreationTimestamp="2025-12-01 20:35:59 +0000 UTC" firstStartedPulling="2025-12-01 
20:36:00.610189338 +0000 UTC m=+2382.172748979" lastFinishedPulling="2025-12-01 20:36:01.135344933 +0000 UTC m=+2382.697904604" observedRunningTime="2025-12-01 20:36:01.702922503 +0000 UTC m=+2383.265482144" watchObservedRunningTime="2025-12-01 20:36:01.714424862 +0000 UTC m=+2383.276984503" Dec 01 20:36:06 crc kubenswrapper[4802]: I1201 20:36:06.720622 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:36:06 crc kubenswrapper[4802]: E1201 20:36:06.721678 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:36:21 crc kubenswrapper[4802]: I1201 20:36:21.719678 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:36:21 crc kubenswrapper[4802]: E1201 20:36:21.720450 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:36:26 crc kubenswrapper[4802]: I1201 20:36:26.601835 4802 scope.go:117] "RemoveContainer" containerID="698d459d0ded6c9b623351b9d47258461f3bf1cfbffa744ef5a1b015e67a7c56" Dec 01 20:36:26 crc kubenswrapper[4802]: I1201 20:36:26.643668 4802 scope.go:117] "RemoveContainer" containerID="0f4ff116b520e878f5afcd628b57664fc0a65580894d364a5d724670691557a9" Dec 
01 20:36:26 crc kubenswrapper[4802]: I1201 20:36:26.912405 4802 generic.go:334] "Generic (PLEG): container finished" podID="3a154c18-5d93-4d73-9e97-90fb21082eea" containerID="e31e6e2b3fecef3a16be08a101d7a386f6638070ba7a4c6c188e9f9b58a418f0" exitCode=0 Dec 01 20:36:26 crc kubenswrapper[4802]: I1201 20:36:26.912454 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" event={"ID":"3a154c18-5d93-4d73-9e97-90fb21082eea","Type":"ContainerDied","Data":"e31e6e2b3fecef3a16be08a101d7a386f6638070ba7a4c6c188e9f9b58a418f0"} Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.423465 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.513118 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ceph\") pod \"3a154c18-5d93-4d73-9e97-90fb21082eea\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.513188 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-inventory\") pod \"3a154c18-5d93-4d73-9e97-90fb21082eea\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.513261 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fshh9\" (UniqueName: \"kubernetes.io/projected/3a154c18-5d93-4d73-9e97-90fb21082eea-kube-api-access-fshh9\") pod \"3a154c18-5d93-4d73-9e97-90fb21082eea\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.513474 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ssh-key\") pod \"3a154c18-5d93-4d73-9e97-90fb21082eea\" (UID: \"3a154c18-5d93-4d73-9e97-90fb21082eea\") " Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.520259 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ceph" (OuterVolumeSpecName: "ceph") pod "3a154c18-5d93-4d73-9e97-90fb21082eea" (UID: "3a154c18-5d93-4d73-9e97-90fb21082eea"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.520671 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a154c18-5d93-4d73-9e97-90fb21082eea-kube-api-access-fshh9" (OuterVolumeSpecName: "kube-api-access-fshh9") pod "3a154c18-5d93-4d73-9e97-90fb21082eea" (UID: "3a154c18-5d93-4d73-9e97-90fb21082eea"). InnerVolumeSpecName "kube-api-access-fshh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.541644 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-inventory" (OuterVolumeSpecName: "inventory") pod "3a154c18-5d93-4d73-9e97-90fb21082eea" (UID: "3a154c18-5d93-4d73-9e97-90fb21082eea"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.543349 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3a154c18-5d93-4d73-9e97-90fb21082eea" (UID: "3a154c18-5d93-4d73-9e97-90fb21082eea"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.614643 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.614674 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.614684 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fshh9\" (UniqueName: \"kubernetes.io/projected/3a154c18-5d93-4d73-9e97-90fb21082eea-kube-api-access-fshh9\") on node \"crc\" DevicePath \"\"" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.614693 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a154c18-5d93-4d73-9e97-90fb21082eea-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.931405 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" event={"ID":"3a154c18-5d93-4d73-9e97-90fb21082eea","Type":"ContainerDied","Data":"4b7d194e495e333b144109eecbcc85f4b0722c4c8656ef4e15db768900d48252"} Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.931458 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b7d194e495e333b144109eecbcc85f4b0722c4c8656ef4e15db768900d48252" Dec 01 20:36:28 crc kubenswrapper[4802]: I1201 20:36:28.931467 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-g62m8" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.041714 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm"] Dec 01 20:36:29 crc kubenswrapper[4802]: E1201 20:36:29.042155 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a154c18-5d93-4d73-9e97-90fb21082eea" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.042176 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a154c18-5d93-4d73-9e97-90fb21082eea" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.042445 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a154c18-5d93-4d73-9e97-90fb21082eea" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.043102 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.045058 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.045524 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.046076 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.049386 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.050412 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.059357 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm"] Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.223264 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv49g\" (UniqueName: \"kubernetes.io/projected/43faaae9-0df9-4e49-a5cc-2fc51e008edc-kube-api-access-xv49g\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.223392 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: 
\"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.223526 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.223621 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.326827 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.327012 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.327307 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xv49g\" (UniqueName: \"kubernetes.io/projected/43faaae9-0df9-4e49-a5cc-2fc51e008edc-kube-api-access-xv49g\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.327376 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.335143 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.339330 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.352092 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv49g\" (UniqueName: \"kubernetes.io/projected/43faaae9-0df9-4e49-a5cc-2fc51e008edc-kube-api-access-xv49g\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: 
\"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.376022 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vngdm\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:29 crc kubenswrapper[4802]: I1201 20:36:29.662078 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:30 crc kubenswrapper[4802]: I1201 20:36:30.202450 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm"] Dec 01 20:36:30 crc kubenswrapper[4802]: I1201 20:36:30.979898 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" event={"ID":"43faaae9-0df9-4e49-a5cc-2fc51e008edc","Type":"ContainerStarted","Data":"a7dbe496b4645ea012a15758e3b8fc502615e97c2c1cf2556a1ade84271a95d6"} Dec 01 20:36:31 crc kubenswrapper[4802]: I1201 20:36:31.992336 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" event={"ID":"43faaae9-0df9-4e49-a5cc-2fc51e008edc","Type":"ContainerStarted","Data":"505ef28911a67e5153b7bf0281937ba63f91130b01e3ef2915a270c691609605"} Dec 01 20:36:32 crc kubenswrapper[4802]: I1201 20:36:32.018782 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" podStartSLOduration=2.3550501329999998 podStartE2EDuration="3.018761166s" podCreationTimestamp="2025-12-01 20:36:29 +0000 UTC" 
firstStartedPulling="2025-12-01 20:36:30.206276919 +0000 UTC m=+2411.768836560" lastFinishedPulling="2025-12-01 20:36:30.869987932 +0000 UTC m=+2412.432547593" observedRunningTime="2025-12-01 20:36:32.01758696 +0000 UTC m=+2413.580146721" watchObservedRunningTime="2025-12-01 20:36:32.018761166 +0000 UTC m=+2413.581320837" Dec 01 20:36:32 crc kubenswrapper[4802]: I1201 20:36:32.720721 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:36:32 crc kubenswrapper[4802]: E1201 20:36:32.721024 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:36:36 crc kubenswrapper[4802]: I1201 20:36:36.030492 4802 generic.go:334] "Generic (PLEG): container finished" podID="43faaae9-0df9-4e49-a5cc-2fc51e008edc" containerID="505ef28911a67e5153b7bf0281937ba63f91130b01e3ef2915a270c691609605" exitCode=0 Dec 01 20:36:36 crc kubenswrapper[4802]: I1201 20:36:36.030599 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" event={"ID":"43faaae9-0df9-4e49-a5cc-2fc51e008edc","Type":"ContainerDied","Data":"505ef28911a67e5153b7bf0281937ba63f91130b01e3ef2915a270c691609605"} Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.433681 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.616793 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv49g\" (UniqueName: \"kubernetes.io/projected/43faaae9-0df9-4e49-a5cc-2fc51e008edc-kube-api-access-xv49g\") pod \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.616967 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ssh-key\") pod \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.617366 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ceph\") pod \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.617450 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-inventory\") pod \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\" (UID: \"43faaae9-0df9-4e49-a5cc-2fc51e008edc\") " Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.623234 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ceph" (OuterVolumeSpecName: "ceph") pod "43faaae9-0df9-4e49-a5cc-2fc51e008edc" (UID: "43faaae9-0df9-4e49-a5cc-2fc51e008edc"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.623931 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43faaae9-0df9-4e49-a5cc-2fc51e008edc-kube-api-access-xv49g" (OuterVolumeSpecName: "kube-api-access-xv49g") pod "43faaae9-0df9-4e49-a5cc-2fc51e008edc" (UID: "43faaae9-0df9-4e49-a5cc-2fc51e008edc"). InnerVolumeSpecName "kube-api-access-xv49g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.644445 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-inventory" (OuterVolumeSpecName: "inventory") pod "43faaae9-0df9-4e49-a5cc-2fc51e008edc" (UID: "43faaae9-0df9-4e49-a5cc-2fc51e008edc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.664304 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "43faaae9-0df9-4e49-a5cc-2fc51e008edc" (UID: "43faaae9-0df9-4e49-a5cc-2fc51e008edc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.720468 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.720754 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.720763 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43faaae9-0df9-4e49-a5cc-2fc51e008edc-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:36:37 crc kubenswrapper[4802]: I1201 20:36:37.720775 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv49g\" (UniqueName: \"kubernetes.io/projected/43faaae9-0df9-4e49-a5cc-2fc51e008edc-kube-api-access-xv49g\") on node \"crc\" DevicePath \"\"" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.047579 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" event={"ID":"43faaae9-0df9-4e49-a5cc-2fc51e008edc","Type":"ContainerDied","Data":"a7dbe496b4645ea012a15758e3b8fc502615e97c2c1cf2556a1ade84271a95d6"} Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.047617 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7dbe496b4645ea012a15758e3b8fc502615e97c2c1cf2556a1ade84271a95d6" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.047701 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vngdm" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.134806 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq"] Dec 01 20:36:38 crc kubenswrapper[4802]: E1201 20:36:38.135163 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43faaae9-0df9-4e49-a5cc-2fc51e008edc" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.135181 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="43faaae9-0df9-4e49-a5cc-2fc51e008edc" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.135448 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="43faaae9-0df9-4e49-a5cc-2fc51e008edc" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.136100 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.138776 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.140883 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.144526 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.144591 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.144920 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.153038 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq"] Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.231428 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq42s\" (UniqueName: \"kubernetes.io/projected/ba286b73-fe12-499f-b959-296217015c6b-kube-api-access-tq42s\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.231506 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") 
" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.231726 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.231845 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.334818 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.335367 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.335468 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq42s\" (UniqueName: 
\"kubernetes.io/projected/ba286b73-fe12-499f-b959-296217015c6b-kube-api-access-tq42s\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.335548 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.340477 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.345943 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.346175 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.359620 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq42s\" (UniqueName: \"kubernetes.io/projected/ba286b73-fe12-499f-b959-296217015c6b-kube-api-access-tq42s\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwtvq\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:38 crc kubenswrapper[4802]: I1201 20:36:38.472887 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:36:39 crc kubenswrapper[4802]: I1201 20:36:39.013534 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq"] Dec 01 20:36:39 crc kubenswrapper[4802]: W1201 20:36:39.031578 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba286b73_fe12_499f_b959_296217015c6b.slice/crio-314e059ae8a3ab334b85ab46f4f278261c60cbc612f06ba9ed01e76cbe8035c2 WatchSource:0}: Error finding container 314e059ae8a3ab334b85ab46f4f278261c60cbc612f06ba9ed01e76cbe8035c2: Status 404 returned error can't find the container with id 314e059ae8a3ab334b85ab46f4f278261c60cbc612f06ba9ed01e76cbe8035c2 Dec 01 20:36:39 crc kubenswrapper[4802]: I1201 20:36:39.033758 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 20:36:39 crc kubenswrapper[4802]: I1201 20:36:39.066320 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" event={"ID":"ba286b73-fe12-499f-b959-296217015c6b","Type":"ContainerStarted","Data":"314e059ae8a3ab334b85ab46f4f278261c60cbc612f06ba9ed01e76cbe8035c2"} Dec 01 20:36:40 crc kubenswrapper[4802]: I1201 20:36:40.077822 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" event={"ID":"ba286b73-fe12-499f-b959-296217015c6b","Type":"ContainerStarted","Data":"701bd31ed84c7a53102800fdc1a5d0f947c7da8e6f26b86ad7452fd232b45b99"} Dec 01 20:36:40 crc kubenswrapper[4802]: I1201 20:36:40.102065 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" podStartSLOduration=1.6274641079999999 podStartE2EDuration="2.102047682s" podCreationTimestamp="2025-12-01 20:36:38 +0000 UTC" firstStartedPulling="2025-12-01 20:36:39.033499303 +0000 UTC m=+2420.596058944" lastFinishedPulling="2025-12-01 20:36:39.508082877 +0000 UTC m=+2421.070642518" observedRunningTime="2025-12-01 20:36:40.098001555 +0000 UTC m=+2421.660561196" watchObservedRunningTime="2025-12-01 20:36:40.102047682 +0000 UTC m=+2421.664607323" Dec 01 20:36:47 crc kubenswrapper[4802]: I1201 20:36:47.720636 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:36:47 crc kubenswrapper[4802]: E1201 20:36:47.725166 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.353186 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wh5rx"] Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.355416 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.366624 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wh5rx"] Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.532613 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-utilities\") pod \"community-operators-wh5rx\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.532728 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-catalog-content\") pod \"community-operators-wh5rx\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.533101 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6vp4\" (UniqueName: \"kubernetes.io/projected/9db6a82a-82d0-4661-a6f3-31a5726eb89d-kube-api-access-x6vp4\") pod \"community-operators-wh5rx\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.634844 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-utilities\") pod \"community-operators-wh5rx\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.634887 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-catalog-content\") pod \"community-operators-wh5rx\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.635008 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6vp4\" (UniqueName: \"kubernetes.io/projected/9db6a82a-82d0-4661-a6f3-31a5726eb89d-kube-api-access-x6vp4\") pod \"community-operators-wh5rx\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.635667 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-catalog-content\") pod \"community-operators-wh5rx\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.636051 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-utilities\") pod \"community-operators-wh5rx\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.658289 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6vp4\" (UniqueName: \"kubernetes.io/projected/9db6a82a-82d0-4661-a6f3-31a5726eb89d-kube-api-access-x6vp4\") pod \"community-operators-wh5rx\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:57 crc kubenswrapper[4802]: I1201 20:36:57.694310 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:36:58 crc kubenswrapper[4802]: I1201 20:36:58.205356 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wh5rx"] Dec 01 20:36:58 crc kubenswrapper[4802]: I1201 20:36:58.227589 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh5rx" event={"ID":"9db6a82a-82d0-4661-a6f3-31a5726eb89d","Type":"ContainerStarted","Data":"38251d2d64a07ed6356eccab99a8bb693c3aaf1352501814d2ec1c4144a7ef64"} Dec 01 20:36:59 crc kubenswrapper[4802]: I1201 20:36:59.237760 4802 generic.go:334] "Generic (PLEG): container finished" podID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerID="ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1" exitCode=0 Dec 01 20:36:59 crc kubenswrapper[4802]: I1201 20:36:59.237823 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh5rx" event={"ID":"9db6a82a-82d0-4661-a6f3-31a5726eb89d","Type":"ContainerDied","Data":"ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1"} Dec 01 20:36:59 crc kubenswrapper[4802]: I1201 20:36:59.721359 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:36:59 crc kubenswrapper[4802]: E1201 20:36:59.721696 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:37:02 crc kubenswrapper[4802]: I1201 20:37:02.277812 4802 generic.go:334] "Generic (PLEG): container finished" podID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" 
containerID="09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311" exitCode=0 Dec 01 20:37:02 crc kubenswrapper[4802]: I1201 20:37:02.277906 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh5rx" event={"ID":"9db6a82a-82d0-4661-a6f3-31a5726eb89d","Type":"ContainerDied","Data":"09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311"} Dec 01 20:37:03 crc kubenswrapper[4802]: I1201 20:37:03.288233 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh5rx" event={"ID":"9db6a82a-82d0-4661-a6f3-31a5726eb89d","Type":"ContainerStarted","Data":"cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a"} Dec 01 20:37:03 crc kubenswrapper[4802]: I1201 20:37:03.308597 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wh5rx" podStartSLOduration=2.660863878 podStartE2EDuration="6.308581435s" podCreationTimestamp="2025-12-01 20:36:57 +0000 UTC" firstStartedPulling="2025-12-01 20:36:59.240827087 +0000 UTC m=+2440.803386728" lastFinishedPulling="2025-12-01 20:37:02.888544644 +0000 UTC m=+2444.451104285" observedRunningTime="2025-12-01 20:37:03.302164375 +0000 UTC m=+2444.864724026" watchObservedRunningTime="2025-12-01 20:37:03.308581435 +0000 UTC m=+2444.871141076" Dec 01 20:37:07 crc kubenswrapper[4802]: I1201 20:37:07.694540 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:37:07 crc kubenswrapper[4802]: I1201 20:37:07.695132 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:37:07 crc kubenswrapper[4802]: I1201 20:37:07.745775 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:37:08 crc kubenswrapper[4802]: I1201 
20:37:08.394026 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:37:08 crc kubenswrapper[4802]: I1201 20:37:08.442853 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wh5rx"] Dec 01 20:37:10 crc kubenswrapper[4802]: I1201 20:37:10.354278 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wh5rx" podUID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerName="registry-server" containerID="cri-o://cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a" gracePeriod=2 Dec 01 20:37:10 crc kubenswrapper[4802]: I1201 20:37:10.839791 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:37:10 crc kubenswrapper[4802]: I1201 20:37:10.992176 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-utilities\") pod \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " Dec 01 20:37:10 crc kubenswrapper[4802]: I1201 20:37:10.992615 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-catalog-content\") pod \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " Dec 01 20:37:10 crc kubenswrapper[4802]: I1201 20:37:10.992885 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6vp4\" (UniqueName: \"kubernetes.io/projected/9db6a82a-82d0-4661-a6f3-31a5726eb89d-kube-api-access-x6vp4\") pod \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\" (UID: \"9db6a82a-82d0-4661-a6f3-31a5726eb89d\") " Dec 01 20:37:10 crc kubenswrapper[4802]: 
I1201 20:37:10.993235 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-utilities" (OuterVolumeSpecName: "utilities") pod "9db6a82a-82d0-4661-a6f3-31a5726eb89d" (UID: "9db6a82a-82d0-4661-a6f3-31a5726eb89d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:37:10 crc kubenswrapper[4802]: I1201 20:37:10.993578 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.003538 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db6a82a-82d0-4661-a6f3-31a5726eb89d-kube-api-access-x6vp4" (OuterVolumeSpecName: "kube-api-access-x6vp4") pod "9db6a82a-82d0-4661-a6f3-31a5726eb89d" (UID: "9db6a82a-82d0-4661-a6f3-31a5726eb89d"). InnerVolumeSpecName "kube-api-access-x6vp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.045313 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9db6a82a-82d0-4661-a6f3-31a5726eb89d" (UID: "9db6a82a-82d0-4661-a6f3-31a5726eb89d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.095048 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db6a82a-82d0-4661-a6f3-31a5726eb89d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.095081 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6vp4\" (UniqueName: \"kubernetes.io/projected/9db6a82a-82d0-4661-a6f3-31a5726eb89d-kube-api-access-x6vp4\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.364383 4802 generic.go:334] "Generic (PLEG): container finished" podID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerID="cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a" exitCode=0 Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.364539 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wh5rx" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.364471 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh5rx" event={"ID":"9db6a82a-82d0-4661-a6f3-31a5726eb89d","Type":"ContainerDied","Data":"cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a"} Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.365652 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wh5rx" event={"ID":"9db6a82a-82d0-4661-a6f3-31a5726eb89d","Type":"ContainerDied","Data":"38251d2d64a07ed6356eccab99a8bb693c3aaf1352501814d2ec1c4144a7ef64"} Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.365710 4802 scope.go:117] "RemoveContainer" containerID="cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.399776 4802 scope.go:117] "RemoveContainer" 
containerID="09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.428579 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wh5rx"] Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.441156 4802 scope.go:117] "RemoveContainer" containerID="ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.443649 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wh5rx"] Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.471528 4802 scope.go:117] "RemoveContainer" containerID="cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a" Dec 01 20:37:11 crc kubenswrapper[4802]: E1201 20:37:11.472623 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a\": container with ID starting with cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a not found: ID does not exist" containerID="cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.472666 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a"} err="failed to get container status \"cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a\": rpc error: code = NotFound desc = could not find container \"cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a\": container with ID starting with cf814e1b3147ed3f7a4d84ff025491da10edea7566019ce5dfeaca71f5c4be1a not found: ID does not exist" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.472692 4802 scope.go:117] "RemoveContainer" 
containerID="09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311" Dec 01 20:37:11 crc kubenswrapper[4802]: E1201 20:37:11.473000 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311\": container with ID starting with 09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311 not found: ID does not exist" containerID="09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.473037 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311"} err="failed to get container status \"09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311\": rpc error: code = NotFound desc = could not find container \"09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311\": container with ID starting with 09ef496b9920533bb87eb9ca78ea3e9d2f62b56ed0977d5e962f34637511b311 not found: ID does not exist" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.473057 4802 scope.go:117] "RemoveContainer" containerID="ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1" Dec 01 20:37:11 crc kubenswrapper[4802]: E1201 20:37:11.473523 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1\": container with ID starting with ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1 not found: ID does not exist" containerID="ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1" Dec 01 20:37:11 crc kubenswrapper[4802]: I1201 20:37:11.473545 4802 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1"} err="failed to get container status \"ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1\": rpc error: code = NotFound desc = could not find container \"ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1\": container with ID starting with ffb52d317339487b887714f1a35c6bf8e4b597542b44790f16f2f8043ae815b1 not found: ID does not exist" Dec 01 20:37:12 crc kubenswrapper[4802]: I1201 20:37:12.720648 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:37:12 crc kubenswrapper[4802]: E1201 20:37:12.721776 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:37:12 crc kubenswrapper[4802]: I1201 20:37:12.740082 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" path="/var/lib/kubelet/pods/9db6a82a-82d0-4661-a6f3-31a5726eb89d/volumes" Dec 01 20:37:15 crc kubenswrapper[4802]: I1201 20:37:15.402054 4802 generic.go:334] "Generic (PLEG): container finished" podID="ba286b73-fe12-499f-b959-296217015c6b" containerID="701bd31ed84c7a53102800fdc1a5d0f947c7da8e6f26b86ad7452fd232b45b99" exitCode=0 Dec 01 20:37:15 crc kubenswrapper[4802]: I1201 20:37:15.402159 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" event={"ID":"ba286b73-fe12-499f-b959-296217015c6b","Type":"ContainerDied","Data":"701bd31ed84c7a53102800fdc1a5d0f947c7da8e6f26b86ad7452fd232b45b99"} Dec 01 20:37:16 crc 
kubenswrapper[4802]: I1201 20:37:16.828853 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.003943 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-inventory\") pod \"ba286b73-fe12-499f-b959-296217015c6b\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.004061 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq42s\" (UniqueName: \"kubernetes.io/projected/ba286b73-fe12-499f-b959-296217015c6b-kube-api-access-tq42s\") pod \"ba286b73-fe12-499f-b959-296217015c6b\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.004107 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ceph\") pod \"ba286b73-fe12-499f-b959-296217015c6b\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.004259 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ssh-key\") pod \"ba286b73-fe12-499f-b959-296217015c6b\" (UID: \"ba286b73-fe12-499f-b959-296217015c6b\") " Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.009503 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba286b73-fe12-499f-b959-296217015c6b-kube-api-access-tq42s" (OuterVolumeSpecName: "kube-api-access-tq42s") pod "ba286b73-fe12-499f-b959-296217015c6b" (UID: "ba286b73-fe12-499f-b959-296217015c6b"). InnerVolumeSpecName "kube-api-access-tq42s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.011071 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ceph" (OuterVolumeSpecName: "ceph") pod "ba286b73-fe12-499f-b959-296217015c6b" (UID: "ba286b73-fe12-499f-b959-296217015c6b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.037571 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-inventory" (OuterVolumeSpecName: "inventory") pod "ba286b73-fe12-499f-b959-296217015c6b" (UID: "ba286b73-fe12-499f-b959-296217015c6b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.044587 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ba286b73-fe12-499f-b959-296217015c6b" (UID: "ba286b73-fe12-499f-b959-296217015c6b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.106600 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.106637 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.106651 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq42s\" (UniqueName: \"kubernetes.io/projected/ba286b73-fe12-499f-b959-296217015c6b-kube-api-access-tq42s\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.106665 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba286b73-fe12-499f-b959-296217015c6b-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.420083 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" event={"ID":"ba286b73-fe12-499f-b959-296217015c6b","Type":"ContainerDied","Data":"314e059ae8a3ab334b85ab46f4f278261c60cbc612f06ba9ed01e76cbe8035c2"} Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.420131 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="314e059ae8a3ab334b85ab46f4f278261c60cbc612f06ba9ed01e76cbe8035c2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.420152 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwtvq" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.509972 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2"] Dec 01 20:37:17 crc kubenswrapper[4802]: E1201 20:37:17.510365 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerName="extract-content" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.510383 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerName="extract-content" Dec 01 20:37:17 crc kubenswrapper[4802]: E1201 20:37:17.510400 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerName="registry-server" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.510406 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerName="registry-server" Dec 01 20:37:17 crc kubenswrapper[4802]: E1201 20:37:17.510419 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba286b73-fe12-499f-b959-296217015c6b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.510427 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba286b73-fe12-499f-b959-296217015c6b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:37:17 crc kubenswrapper[4802]: E1201 20:37:17.510450 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerName="extract-utilities" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.510456 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerName="extract-utilities" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.510605 
4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba286b73-fe12-499f-b959-296217015c6b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.510623 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db6a82a-82d0-4661-a6f3-31a5726eb89d" containerName="registry-server" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.511142 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.514420 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.514461 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.514640 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.515172 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.515565 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.530740 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2"] Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.614414 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: 
\"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.614743 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.614797 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.614880 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhww6\" (UniqueName: \"kubernetes.io/projected/0b81c691-0a4b-48b6-b1e1-151e1cac847c-kube-api-access-rhww6\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.716343 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.716463 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rhww6\" (UniqueName: \"kubernetes.io/projected/0b81c691-0a4b-48b6-b1e1-151e1cac847c-kube-api-access-rhww6\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.716526 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.716562 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.721723 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.721953 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.721983 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.732262 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhww6\" (UniqueName: \"kubernetes.io/projected/0b81c691-0a4b-48b6-b1e1-151e1cac847c-kube-api-access-rhww6\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:17 crc kubenswrapper[4802]: I1201 20:37:17.828798 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:18 crc kubenswrapper[4802]: I1201 20:37:18.396248 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2"] Dec 01 20:37:18 crc kubenswrapper[4802]: I1201 20:37:18.432391 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" event={"ID":"0b81c691-0a4b-48b6-b1e1-151e1cac847c","Type":"ContainerStarted","Data":"59ec2b35c4c410ddb0130e3cb02f73e606e89b6ea5519e57e214561fc23f5743"} Dec 01 20:37:19 crc kubenswrapper[4802]: I1201 20:37:19.211847 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:37:19 crc kubenswrapper[4802]: I1201 20:37:19.457234 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" event={"ID":"0b81c691-0a4b-48b6-b1e1-151e1cac847c","Type":"ContainerStarted","Data":"b29493fd715961dec2650fd61d3b4d495c49fd9a330f93a1047b3175e47ac524"} Dec 01 20:37:19 crc kubenswrapper[4802]: I1201 20:37:19.480358 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" podStartSLOduration=1.6716695750000001 podStartE2EDuration="2.480340636s" podCreationTimestamp="2025-12-01 20:37:17 +0000 UTC" firstStartedPulling="2025-12-01 20:37:18.400367581 +0000 UTC m=+2459.962927222" lastFinishedPulling="2025-12-01 20:37:19.209038602 +0000 UTC m=+2460.771598283" observedRunningTime="2025-12-01 20:37:19.473762221 +0000 UTC m=+2461.036321862" watchObservedRunningTime="2025-12-01 20:37:19.480340636 +0000 UTC m=+2461.042900277" Dec 01 20:37:23 crc kubenswrapper[4802]: I1201 20:37:23.490171 4802 generic.go:334] "Generic (PLEG): container finished" podID="0b81c691-0a4b-48b6-b1e1-151e1cac847c" 
containerID="b29493fd715961dec2650fd61d3b4d495c49fd9a330f93a1047b3175e47ac524" exitCode=0 Dec 01 20:37:23 crc kubenswrapper[4802]: I1201 20:37:23.490236 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" event={"ID":"0b81c691-0a4b-48b6-b1e1-151e1cac847c","Type":"ContainerDied","Data":"b29493fd715961dec2650fd61d3b4d495c49fd9a330f93a1047b3175e47ac524"} Dec 01 20:37:23 crc kubenswrapper[4802]: I1201 20:37:23.720168 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:37:23 crc kubenswrapper[4802]: E1201 20:37:23.720509 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:37:24 crc kubenswrapper[4802]: I1201 20:37:24.930870 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.059000 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ssh-key\") pod \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.059301 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-inventory\") pod \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.059364 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhww6\" (UniqueName: \"kubernetes.io/projected/0b81c691-0a4b-48b6-b1e1-151e1cac847c-kube-api-access-rhww6\") pod \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.059478 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ceph\") pod \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\" (UID: \"0b81c691-0a4b-48b6-b1e1-151e1cac847c\") " Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.065088 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b81c691-0a4b-48b6-b1e1-151e1cac847c-kube-api-access-rhww6" (OuterVolumeSpecName: "kube-api-access-rhww6") pod "0b81c691-0a4b-48b6-b1e1-151e1cac847c" (UID: "0b81c691-0a4b-48b6-b1e1-151e1cac847c"). InnerVolumeSpecName "kube-api-access-rhww6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.065230 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ceph" (OuterVolumeSpecName: "ceph") pod "0b81c691-0a4b-48b6-b1e1-151e1cac847c" (UID: "0b81c691-0a4b-48b6-b1e1-151e1cac847c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.084658 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b81c691-0a4b-48b6-b1e1-151e1cac847c" (UID: "0b81c691-0a4b-48b6-b1e1-151e1cac847c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.084980 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-inventory" (OuterVolumeSpecName: "inventory") pod "0b81c691-0a4b-48b6-b1e1-151e1cac847c" (UID: "0b81c691-0a4b-48b6-b1e1-151e1cac847c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.161711 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.161739 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.161749 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhww6\" (UniqueName: \"kubernetes.io/projected/0b81c691-0a4b-48b6-b1e1-151e1cac847c-kube-api-access-rhww6\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.161758 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b81c691-0a4b-48b6-b1e1-151e1cac847c-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.506250 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" event={"ID":"0b81c691-0a4b-48b6-b1e1-151e1cac847c","Type":"ContainerDied","Data":"59ec2b35c4c410ddb0130e3cb02f73e606e89b6ea5519e57e214561fc23f5743"} Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.506298 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59ec2b35c4c410ddb0130e3cb02f73e606e89b6ea5519e57e214561fc23f5743" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.506310 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.589854 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz"] Dec 01 20:37:25 crc kubenswrapper[4802]: E1201 20:37:25.590253 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b81c691-0a4b-48b6-b1e1-151e1cac847c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.590274 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b81c691-0a4b-48b6-b1e1-151e1cac847c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.590506 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b81c691-0a4b-48b6-b1e1-151e1cac847c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.591141 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.593636 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.593730 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.595750 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.595892 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.595995 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.600287 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz"] Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.774055 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.774144 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.774186 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.774234 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkm9\" (UniqueName: \"kubernetes.io/projected/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-kube-api-access-ckkm9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.876349 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.876420 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.876455 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ckkm9\" (UniqueName: \"kubernetes.io/projected/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-kube-api-access-ckkm9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.876788 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.882942 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.886939 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.887546 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc 
kubenswrapper[4802]: I1201 20:37:25.894613 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkm9\" (UniqueName: \"kubernetes.io/projected/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-kube-api-access-ckkm9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:25 crc kubenswrapper[4802]: I1201 20:37:25.913147 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:37:26 crc kubenswrapper[4802]: I1201 20:37:26.426291 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz"] Dec 01 20:37:26 crc kubenswrapper[4802]: I1201 20:37:26.517879 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" event={"ID":"5a9ae28b-9e09-4918-b72b-e22abd2e6dec","Type":"ContainerStarted","Data":"fe733503eaeebdd2cdece67490f10b460fe5e30cbe528df8d9e8d1d6657f8bc9"} Dec 01 20:37:27 crc kubenswrapper[4802]: I1201 20:37:27.529071 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" event={"ID":"5a9ae28b-9e09-4918-b72b-e22abd2e6dec","Type":"ContainerStarted","Data":"ed1f0bd1901e8d2fe3965ee5c3ed63acc2b1cd903740b547c35baa23dfa539e7"} Dec 01 20:37:27 crc kubenswrapper[4802]: I1201 20:37:27.545520 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" podStartSLOduration=1.999790798 podStartE2EDuration="2.545499335s" podCreationTimestamp="2025-12-01 20:37:25 +0000 UTC" firstStartedPulling="2025-12-01 20:37:26.438480814 +0000 UTC m=+2468.001040455" lastFinishedPulling="2025-12-01 20:37:26.984189351 +0000 UTC m=+2468.546748992" 
observedRunningTime="2025-12-01 20:37:27.542546872 +0000 UTC m=+2469.105106513" watchObservedRunningTime="2025-12-01 20:37:27.545499335 +0000 UTC m=+2469.108058986" Dec 01 20:37:34 crc kubenswrapper[4802]: I1201 20:37:34.720860 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:37:34 crc kubenswrapper[4802]: E1201 20:37:34.721893 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:37:45 crc kubenswrapper[4802]: I1201 20:37:45.720888 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:37:45 crc kubenswrapper[4802]: E1201 20:37:45.721759 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:37:56 crc kubenswrapper[4802]: I1201 20:37:56.729660 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w8c7f"] Dec 01 20:37:56 crc kubenswrapper[4802]: I1201 20:37:56.731913 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w8c7f"] Dec 01 20:37:56 crc kubenswrapper[4802]: I1201 20:37:56.732006 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:56 crc kubenswrapper[4802]: I1201 20:37:56.926494 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhhfz\" (UniqueName: \"kubernetes.io/projected/aa67ce39-3813-47cd-9f8a-d73797769264-kube-api-access-fhhfz\") pod \"redhat-marketplace-w8c7f\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:56 crc kubenswrapper[4802]: I1201 20:37:56.927150 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-catalog-content\") pod \"redhat-marketplace-w8c7f\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:56 crc kubenswrapper[4802]: I1201 20:37:56.927333 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-utilities\") pod \"redhat-marketplace-w8c7f\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:57 crc kubenswrapper[4802]: I1201 20:37:57.029804 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-catalog-content\") pod \"redhat-marketplace-w8c7f\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:57 crc kubenswrapper[4802]: I1201 20:37:57.029888 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-utilities\") pod \"redhat-marketplace-w8c7f\" 
(UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:57 crc kubenswrapper[4802]: I1201 20:37:57.029946 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhhfz\" (UniqueName: \"kubernetes.io/projected/aa67ce39-3813-47cd-9f8a-d73797769264-kube-api-access-fhhfz\") pod \"redhat-marketplace-w8c7f\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:57 crc kubenswrapper[4802]: I1201 20:37:57.030703 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-catalog-content\") pod \"redhat-marketplace-w8c7f\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:57 crc kubenswrapper[4802]: I1201 20:37:57.030803 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-utilities\") pod \"redhat-marketplace-w8c7f\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:57 crc kubenswrapper[4802]: I1201 20:37:57.067371 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhhfz\" (UniqueName: \"kubernetes.io/projected/aa67ce39-3813-47cd-9f8a-d73797769264-kube-api-access-fhhfz\") pod \"redhat-marketplace-w8c7f\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:57 crc kubenswrapper[4802]: I1201 20:37:57.362065 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:37:57 crc kubenswrapper[4802]: I1201 20:37:57.774961 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w8c7f"] Dec 01 20:37:57 crc kubenswrapper[4802]: I1201 20:37:57.798007 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w8c7f" event={"ID":"aa67ce39-3813-47cd-9f8a-d73797769264","Type":"ContainerStarted","Data":"49a81eb6882777f88cf71de798480f18d10e29281cec20045ec40f348c890db5"} Dec 01 20:37:58 crc kubenswrapper[4802]: E1201 20:37:58.083564 4802 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa67ce39_3813_47cd_9f8a_d73797769264.slice/crio-conmon-8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3.scope\": RecentStats: unable to find data in memory cache]" Dec 01 20:37:58 crc kubenswrapper[4802]: I1201 20:37:58.806140 4802 generic.go:334] "Generic (PLEG): container finished" podID="aa67ce39-3813-47cd-9f8a-d73797769264" containerID="8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3" exitCode=0 Dec 01 20:37:58 crc kubenswrapper[4802]: I1201 20:37:58.806242 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w8c7f" event={"ID":"aa67ce39-3813-47cd-9f8a-d73797769264","Type":"ContainerDied","Data":"8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3"} Dec 01 20:38:00 crc kubenswrapper[4802]: I1201 20:38:00.720186 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:38:00 crc kubenswrapper[4802]: E1201 20:38:00.721289 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:38:00 crc kubenswrapper[4802]: I1201 20:38:00.823725 4802 generic.go:334] "Generic (PLEG): container finished" podID="aa67ce39-3813-47cd-9f8a-d73797769264" containerID="5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229" exitCode=0 Dec 01 20:38:00 crc kubenswrapper[4802]: I1201 20:38:00.823767 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w8c7f" event={"ID":"aa67ce39-3813-47cd-9f8a-d73797769264","Type":"ContainerDied","Data":"5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229"} Dec 01 20:38:01 crc kubenswrapper[4802]: I1201 20:38:01.832724 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w8c7f" event={"ID":"aa67ce39-3813-47cd-9f8a-d73797769264","Type":"ContainerStarted","Data":"5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58"} Dec 01 20:38:01 crc kubenswrapper[4802]: I1201 20:38:01.860492 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w8c7f" podStartSLOduration=3.407958151 podStartE2EDuration="5.860474252s" podCreationTimestamp="2025-12-01 20:37:56 +0000 UTC" firstStartedPulling="2025-12-01 20:37:58.808260057 +0000 UTC m=+2500.370819698" lastFinishedPulling="2025-12-01 20:38:01.260776158 +0000 UTC m=+2502.823335799" observedRunningTime="2025-12-01 20:38:01.854750843 +0000 UTC m=+2503.417310484" watchObservedRunningTime="2025-12-01 20:38:01.860474252 +0000 UTC m=+2503.423033893" Dec 01 20:38:07 crc kubenswrapper[4802]: I1201 20:38:07.362327 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:38:07 crc 
kubenswrapper[4802]: I1201 20:38:07.362886 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:38:07 crc kubenswrapper[4802]: I1201 20:38:07.410117 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:38:07 crc kubenswrapper[4802]: I1201 20:38:07.931462 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:38:08 crc kubenswrapper[4802]: I1201 20:38:08.901287 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w8c7f"] Dec 01 20:38:09 crc kubenswrapper[4802]: I1201 20:38:09.903793 4802 generic.go:334] "Generic (PLEG): container finished" podID="5a9ae28b-9e09-4918-b72b-e22abd2e6dec" containerID="ed1f0bd1901e8d2fe3965ee5c3ed63acc2b1cd903740b547c35baa23dfa539e7" exitCode=0 Dec 01 20:38:09 crc kubenswrapper[4802]: I1201 20:38:09.903872 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" event={"ID":"5a9ae28b-9e09-4918-b72b-e22abd2e6dec","Type":"ContainerDied","Data":"ed1f0bd1901e8d2fe3965ee5c3ed63acc2b1cd903740b547c35baa23dfa539e7"} Dec 01 20:38:09 crc kubenswrapper[4802]: I1201 20:38:09.903990 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w8c7f" podUID="aa67ce39-3813-47cd-9f8a-d73797769264" containerName="registry-server" containerID="cri-o://5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58" gracePeriod=2 Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.358914 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.492269 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhhfz\" (UniqueName: \"kubernetes.io/projected/aa67ce39-3813-47cd-9f8a-d73797769264-kube-api-access-fhhfz\") pod \"aa67ce39-3813-47cd-9f8a-d73797769264\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.492378 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-utilities\") pod \"aa67ce39-3813-47cd-9f8a-d73797769264\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.492659 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-catalog-content\") pod \"aa67ce39-3813-47cd-9f8a-d73797769264\" (UID: \"aa67ce39-3813-47cd-9f8a-d73797769264\") " Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.493260 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-utilities" (OuterVolumeSpecName: "utilities") pod "aa67ce39-3813-47cd-9f8a-d73797769264" (UID: "aa67ce39-3813-47cd-9f8a-d73797769264"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.497422 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa67ce39-3813-47cd-9f8a-d73797769264-kube-api-access-fhhfz" (OuterVolumeSpecName: "kube-api-access-fhhfz") pod "aa67ce39-3813-47cd-9f8a-d73797769264" (UID: "aa67ce39-3813-47cd-9f8a-d73797769264"). InnerVolumeSpecName "kube-api-access-fhhfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.527800 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa67ce39-3813-47cd-9f8a-d73797769264" (UID: "aa67ce39-3813-47cd-9f8a-d73797769264"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.594499 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.594535 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhhfz\" (UniqueName: \"kubernetes.io/projected/aa67ce39-3813-47cd-9f8a-d73797769264-kube-api-access-fhhfz\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.594547 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa67ce39-3813-47cd-9f8a-d73797769264-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.916025 4802 generic.go:334] "Generic (PLEG): container finished" podID="aa67ce39-3813-47cd-9f8a-d73797769264" containerID="5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58" exitCode=0 Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.916093 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w8c7f" event={"ID":"aa67ce39-3813-47cd-9f8a-d73797769264","Type":"ContainerDied","Data":"5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58"} Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.916131 4802 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w8c7f" Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.916158 4802 scope.go:117] "RemoveContainer" containerID="5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58" Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.916146 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w8c7f" event={"ID":"aa67ce39-3813-47cd-9f8a-d73797769264","Type":"ContainerDied","Data":"49a81eb6882777f88cf71de798480f18d10e29281cec20045ec40f348c890db5"} Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.944235 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w8c7f"] Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.951074 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w8c7f"] Dec 01 20:38:10 crc kubenswrapper[4802]: I1201 20:38:10.957308 4802 scope.go:117] "RemoveContainer" containerID="5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.001907 4802 scope.go:117] "RemoveContainer" containerID="8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.024511 4802 scope.go:117] "RemoveContainer" containerID="5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58" Dec 01 20:38:11 crc kubenswrapper[4802]: E1201 20:38:11.025049 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58\": container with ID starting with 5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58 not found: ID does not exist" containerID="5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.025079 4802 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58"} err="failed to get container status \"5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58\": rpc error: code = NotFound desc = could not find container \"5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58\": container with ID starting with 5302ab32b04f8b8fc3337a8102cba8ae853199dbbeab167926b1dc9a5818ac58 not found: ID does not exist" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.025103 4802 scope.go:117] "RemoveContainer" containerID="5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229" Dec 01 20:38:11 crc kubenswrapper[4802]: E1201 20:38:11.025396 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229\": container with ID starting with 5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229 not found: ID does not exist" containerID="5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.025418 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229"} err="failed to get container status \"5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229\": rpc error: code = NotFound desc = could not find container \"5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229\": container with ID starting with 5d93624ad1392c1a220e9b1fbec865f9e565a8e0530471d6c0ba100d02aba229 not found: ID does not exist" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.025431 4802 scope.go:117] "RemoveContainer" containerID="8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3" Dec 01 20:38:11 crc kubenswrapper[4802]: E1201 
20:38:11.025658 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3\": container with ID starting with 8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3 not found: ID does not exist" containerID="8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.025681 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3"} err="failed to get container status \"8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3\": rpc error: code = NotFound desc = could not find container \"8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3\": container with ID starting with 8eefd53903939ae8c7f5d5460272d2db9b6c77c5e65f1ae1106b3462e7d239e3 not found: ID does not exist" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.321924 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.513713 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-inventory\") pod \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.514270 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ceph\") pod \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.514434 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ssh-key\") pod \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.514471 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckkm9\" (UniqueName: \"kubernetes.io/projected/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-kube-api-access-ckkm9\") pod \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\" (UID: \"5a9ae28b-9e09-4918-b72b-e22abd2e6dec\") " Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.519414 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ceph" (OuterVolumeSpecName: "ceph") pod "5a9ae28b-9e09-4918-b72b-e22abd2e6dec" (UID: "5a9ae28b-9e09-4918-b72b-e22abd2e6dec"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.519419 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-kube-api-access-ckkm9" (OuterVolumeSpecName: "kube-api-access-ckkm9") pod "5a9ae28b-9e09-4918-b72b-e22abd2e6dec" (UID: "5a9ae28b-9e09-4918-b72b-e22abd2e6dec"). InnerVolumeSpecName "kube-api-access-ckkm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.544728 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-inventory" (OuterVolumeSpecName: "inventory") pod "5a9ae28b-9e09-4918-b72b-e22abd2e6dec" (UID: "5a9ae28b-9e09-4918-b72b-e22abd2e6dec"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.547221 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5a9ae28b-9e09-4918-b72b-e22abd2e6dec" (UID: "5a9ae28b-9e09-4918-b72b-e22abd2e6dec"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.625947 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.625978 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckkm9\" (UniqueName: \"kubernetes.io/projected/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-kube-api-access-ckkm9\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.625990 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.625999 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a9ae28b-9e09-4918-b72b-e22abd2e6dec-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.928641 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" event={"ID":"5a9ae28b-9e09-4918-b72b-e22abd2e6dec","Type":"ContainerDied","Data":"fe733503eaeebdd2cdece67490f10b460fe5e30cbe528df8d9e8d1d6657f8bc9"} Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.928995 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe733503eaeebdd2cdece67490f10b460fe5e30cbe528df8d9e8d1d6657f8bc9" Dec 01 20:38:11 crc kubenswrapper[4802]: I1201 20:38:11.928677 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.017344 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zqvg6"] Dec 01 20:38:12 crc kubenswrapper[4802]: E1201 20:38:12.017795 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9ae28b-9e09-4918-b72b-e22abd2e6dec" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.017815 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9ae28b-9e09-4918-b72b-e22abd2e6dec" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:38:12 crc kubenswrapper[4802]: E1201 20:38:12.017837 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa67ce39-3813-47cd-9f8a-d73797769264" containerName="registry-server" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.017846 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa67ce39-3813-47cd-9f8a-d73797769264" containerName="registry-server" Dec 01 20:38:12 crc kubenswrapper[4802]: E1201 20:38:12.017861 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa67ce39-3813-47cd-9f8a-d73797769264" containerName="extract-content" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.017868 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa67ce39-3813-47cd-9f8a-d73797769264" containerName="extract-content" Dec 01 20:38:12 crc kubenswrapper[4802]: E1201 20:38:12.017896 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa67ce39-3813-47cd-9f8a-d73797769264" containerName="extract-utilities" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.017903 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa67ce39-3813-47cd-9f8a-d73797769264" containerName="extract-utilities" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.018088 4802 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9ae28b-9e09-4918-b72b-e22abd2e6dec" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.018105 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa67ce39-3813-47cd-9f8a-d73797769264" containerName="registry-server" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.018860 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.021600 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.021872 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.022035 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.022252 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.023395 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.032020 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.032834 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.032958 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktbmn\" (UniqueName: \"kubernetes.io/projected/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-kube-api-access-ktbmn\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.033112 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ceph\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.044838 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zqvg6"] Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.133889 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ceph\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.133958 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.133996 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.134054 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktbmn\" (UniqueName: \"kubernetes.io/projected/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-kube-api-access-ktbmn\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.138366 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ceph\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.138707 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.146480 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.162251 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktbmn\" (UniqueName: \"kubernetes.io/projected/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-kube-api-access-ktbmn\") pod \"ssh-known-hosts-edpm-deployment-zqvg6\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.345950 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.730884 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa67ce39-3813-47cd-9f8a-d73797769264" path="/var/lib/kubelet/pods/aa67ce39-3813-47cd-9f8a-d73797769264/volumes" Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.892614 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zqvg6"] Dec 01 20:38:12 crc kubenswrapper[4802]: W1201 20:38:12.899114 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73a1d762_0b3b_4abf_9072_0b5bdba7bd72.slice/crio-fd2513284e4f2d8e3dd44f898c58f162386db7da63b717be5c0fef3935f2c0b0 WatchSource:0}: Error finding container fd2513284e4f2d8e3dd44f898c58f162386db7da63b717be5c0fef3935f2c0b0: Status 404 returned error can't find the container with id fd2513284e4f2d8e3dd44f898c58f162386db7da63b717be5c0fef3935f2c0b0 Dec 01 20:38:12 crc kubenswrapper[4802]: I1201 20:38:12.937173 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" 
event={"ID":"73a1d762-0b3b-4abf-9072-0b5bdba7bd72","Type":"ContainerStarted","Data":"fd2513284e4f2d8e3dd44f898c58f162386db7da63b717be5c0fef3935f2c0b0"} Dec 01 20:38:13 crc kubenswrapper[4802]: I1201 20:38:13.720315 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:38:13 crc kubenswrapper[4802]: E1201 20:38:13.720817 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:38:14 crc kubenswrapper[4802]: I1201 20:38:14.976175 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" event={"ID":"73a1d762-0b3b-4abf-9072-0b5bdba7bd72","Type":"ContainerStarted","Data":"4222509ae1eea7c4a27acc599ca4923378a22179e5293dafc885abd5d150152c"} Dec 01 20:38:23 crc kubenswrapper[4802]: I1201 20:38:23.046351 4802 generic.go:334] "Generic (PLEG): container finished" podID="73a1d762-0b3b-4abf-9072-0b5bdba7bd72" containerID="4222509ae1eea7c4a27acc599ca4923378a22179e5293dafc885abd5d150152c" exitCode=0 Dec 01 20:38:23 crc kubenswrapper[4802]: I1201 20:38:23.046426 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" event={"ID":"73a1d762-0b3b-4abf-9072-0b5bdba7bd72","Type":"ContainerDied","Data":"4222509ae1eea7c4a27acc599ca4923378a22179e5293dafc885abd5d150152c"} Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.486808 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.588258 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ceph\") pod \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.588409 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ssh-key-openstack-edpm-ipam\") pod \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.588596 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktbmn\" (UniqueName: \"kubernetes.io/projected/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-kube-api-access-ktbmn\") pod \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.589353 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-inventory-0\") pod \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\" (UID: \"73a1d762-0b3b-4abf-9072-0b5bdba7bd72\") " Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.594097 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ceph" (OuterVolumeSpecName: "ceph") pod "73a1d762-0b3b-4abf-9072-0b5bdba7bd72" (UID: "73a1d762-0b3b-4abf-9072-0b5bdba7bd72"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.599292 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-kube-api-access-ktbmn" (OuterVolumeSpecName: "kube-api-access-ktbmn") pod "73a1d762-0b3b-4abf-9072-0b5bdba7bd72" (UID: "73a1d762-0b3b-4abf-9072-0b5bdba7bd72"). InnerVolumeSpecName "kube-api-access-ktbmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.617258 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "73a1d762-0b3b-4abf-9072-0b5bdba7bd72" (UID: "73a1d762-0b3b-4abf-9072-0b5bdba7bd72"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.617745 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "73a1d762-0b3b-4abf-9072-0b5bdba7bd72" (UID: "73a1d762-0b3b-4abf-9072-0b5bdba7bd72"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.690901 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.690933 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktbmn\" (UniqueName: \"kubernetes.io/projected/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-kube-api-access-ktbmn\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.690943 4802 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:24 crc kubenswrapper[4802]: I1201 20:38:24.690956 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a1d762-0b3b-4abf-9072-0b5bdba7bd72-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.077818 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" event={"ID":"73a1d762-0b3b-4abf-9072-0b5bdba7bd72","Type":"ContainerDied","Data":"fd2513284e4f2d8e3dd44f898c58f162386db7da63b717be5c0fef3935f2c0b0"} Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.078148 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd2513284e4f2d8e3dd44f898c58f162386db7da63b717be5c0fef3935f2c0b0" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.077860 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zqvg6" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.140565 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl"] Dec 01 20:38:25 crc kubenswrapper[4802]: E1201 20:38:25.141058 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a1d762-0b3b-4abf-9072-0b5bdba7bd72" containerName="ssh-known-hosts-edpm-deployment" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.141081 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a1d762-0b3b-4abf-9072-0b5bdba7bd72" containerName="ssh-known-hosts-edpm-deployment" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.141305 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a1d762-0b3b-4abf-9072-0b5bdba7bd72" containerName="ssh-known-hosts-edpm-deployment" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.142040 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.143893 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.145555 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.145798 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.145972 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.146188 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.164312 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl"] Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.199005 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.199113 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.199136 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7qk\" (UniqueName: \"kubernetes.io/projected/73e55b8c-927e-43fa-9104-8db3dc67fdde-kube-api-access-wc7qk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.199244 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.301545 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.301592 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.301620 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7qk\" (UniqueName: 
\"kubernetes.io/projected/73e55b8c-927e-43fa-9104-8db3dc67fdde-kube-api-access-wc7qk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.301696 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.306327 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.306907 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.308480 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.327278 4802 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wc7qk\" (UniqueName: \"kubernetes.io/projected/73e55b8c-927e-43fa-9104-8db3dc67fdde-kube-api-access-wc7qk\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8jtzl\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.465635 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:25 crc kubenswrapper[4802]: I1201 20:38:25.979093 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl"] Dec 01 20:38:26 crc kubenswrapper[4802]: I1201 20:38:26.085322 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" event={"ID":"73e55b8c-927e-43fa-9104-8db3dc67fdde","Type":"ContainerStarted","Data":"0124ed503760a9fad44938c9dce12690e4bced2267d230e5b015d6f1239c2cd1"} Dec 01 20:38:26 crc kubenswrapper[4802]: I1201 20:38:26.720681 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:38:26 crc kubenswrapper[4802]: E1201 20:38:26.721272 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:38:28 crc kubenswrapper[4802]: I1201 20:38:28.102943 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" 
event={"ID":"73e55b8c-927e-43fa-9104-8db3dc67fdde","Type":"ContainerStarted","Data":"fafe72926af7e31dad76307b6d62c9c4e8e531762ff97ef2e23f6cc743d3ac06"} Dec 01 20:38:35 crc kubenswrapper[4802]: I1201 20:38:35.159818 4802 generic.go:334] "Generic (PLEG): container finished" podID="73e55b8c-927e-43fa-9104-8db3dc67fdde" containerID="fafe72926af7e31dad76307b6d62c9c4e8e531762ff97ef2e23f6cc743d3ac06" exitCode=0 Dec 01 20:38:35 crc kubenswrapper[4802]: I1201 20:38:35.159938 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" event={"ID":"73e55b8c-927e-43fa-9104-8db3dc67fdde","Type":"ContainerDied","Data":"fafe72926af7e31dad76307b6d62c9c4e8e531762ff97ef2e23f6cc743d3ac06"} Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.618780 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.808666 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc7qk\" (UniqueName: \"kubernetes.io/projected/73e55b8c-927e-43fa-9104-8db3dc67fdde-kube-api-access-wc7qk\") pod \"73e55b8c-927e-43fa-9104-8db3dc67fdde\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.808751 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ssh-key\") pod \"73e55b8c-927e-43fa-9104-8db3dc67fdde\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.808949 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-inventory\") pod \"73e55b8c-927e-43fa-9104-8db3dc67fdde\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " Dec 
01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.809034 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ceph\") pod \"73e55b8c-927e-43fa-9104-8db3dc67fdde\" (UID: \"73e55b8c-927e-43fa-9104-8db3dc67fdde\") " Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.814177 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e55b8c-927e-43fa-9104-8db3dc67fdde-kube-api-access-wc7qk" (OuterVolumeSpecName: "kube-api-access-wc7qk") pod "73e55b8c-927e-43fa-9104-8db3dc67fdde" (UID: "73e55b8c-927e-43fa-9104-8db3dc67fdde"). InnerVolumeSpecName "kube-api-access-wc7qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.818455 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ceph" (OuterVolumeSpecName: "ceph") pod "73e55b8c-927e-43fa-9104-8db3dc67fdde" (UID: "73e55b8c-927e-43fa-9104-8db3dc67fdde"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.837848 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "73e55b8c-927e-43fa-9104-8db3dc67fdde" (UID: "73e55b8c-927e-43fa-9104-8db3dc67fdde"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.845769 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-inventory" (OuterVolumeSpecName: "inventory") pod "73e55b8c-927e-43fa-9104-8db3dc67fdde" (UID: "73e55b8c-927e-43fa-9104-8db3dc67fdde"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.911742 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.911784 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.911796 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc7qk\" (UniqueName: \"kubernetes.io/projected/73e55b8c-927e-43fa-9104-8db3dc67fdde-kube-api-access-wc7qk\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:36 crc kubenswrapper[4802]: I1201 20:38:36.911808 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73e55b8c-927e-43fa-9104-8db3dc67fdde-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.179170 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" event={"ID":"73e55b8c-927e-43fa-9104-8db3dc67fdde","Type":"ContainerDied","Data":"0124ed503760a9fad44938c9dce12690e4bced2267d230e5b015d6f1239c2cd1"} Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.179231 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0124ed503760a9fad44938c9dce12690e4bced2267d230e5b015d6f1239c2cd1" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.179303 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8jtzl" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.255627 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc"] Dec 01 20:38:37 crc kubenswrapper[4802]: E1201 20:38:37.256036 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e55b8c-927e-43fa-9104-8db3dc67fdde" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.256060 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e55b8c-927e-43fa-9104-8db3dc67fdde" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.256248 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e55b8c-927e-43fa-9104-8db3dc67fdde" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.256832 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.259464 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.259790 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.260853 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.260877 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.262561 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.273247 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc"] Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.319503 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwmvd\" (UniqueName: \"kubernetes.io/projected/5daf0e64-4a00-45c3-9830-46f81436faff-kube-api-access-lwmvd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.319824 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.319925 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.320011 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.421249 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwmvd\" (UniqueName: \"kubernetes.io/projected/5daf0e64-4a00-45c3-9830-46f81436faff-kube-api-access-lwmvd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.421468 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.421520 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.421585 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.424779 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.425449 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.425760 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.439123 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lwmvd\" (UniqueName: \"kubernetes.io/projected/5daf0e64-4a00-45c3-9830-46f81436faff-kube-api-access-lwmvd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.575680 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:37 crc kubenswrapper[4802]: I1201 20:38:37.720884 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:38:38 crc kubenswrapper[4802]: I1201 20:38:38.064894 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc"] Dec 01 20:38:38 crc kubenswrapper[4802]: I1201 20:38:38.187915 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" event={"ID":"5daf0e64-4a00-45c3-9830-46f81436faff","Type":"ContainerStarted","Data":"eb0847378aa14fc2c398a9edb3aa9d3a63e6c1a96035263397c0887962c9add0"} Dec 01 20:38:38 crc kubenswrapper[4802]: I1201 20:38:38.190816 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"4b52124a617ff9aff5ae5f96828f2e1b5ca32a7fd19dd6889c883a07f00e9bf3"} Dec 01 20:38:40 crc kubenswrapper[4802]: I1201 20:38:40.205552 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" event={"ID":"5daf0e64-4a00-45c3-9830-46f81436faff","Type":"ContainerStarted","Data":"d167736954f6036dec2403afd5409b0c30a6412ac0a780c181c88a5e22a97e4c"} Dec 01 20:38:40 crc kubenswrapper[4802]: I1201 20:38:40.228695 4802 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" podStartSLOduration=2.023080622 podStartE2EDuration="3.228676153s" podCreationTimestamp="2025-12-01 20:38:37 +0000 UTC" firstStartedPulling="2025-12-01 20:38:38.069229046 +0000 UTC m=+2539.631788687" lastFinishedPulling="2025-12-01 20:38:39.274824577 +0000 UTC m=+2540.837384218" observedRunningTime="2025-12-01 20:38:40.226238797 +0000 UTC m=+2541.788798478" watchObservedRunningTime="2025-12-01 20:38:40.228676153 +0000 UTC m=+2541.791235794" Dec 01 20:38:51 crc kubenswrapper[4802]: I1201 20:38:51.304803 4802 generic.go:334] "Generic (PLEG): container finished" podID="5daf0e64-4a00-45c3-9830-46f81436faff" containerID="d167736954f6036dec2403afd5409b0c30a6412ac0a780c181c88a5e22a97e4c" exitCode=0 Dec 01 20:38:51 crc kubenswrapper[4802]: I1201 20:38:51.304905 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" event={"ID":"5daf0e64-4a00-45c3-9830-46f81436faff","Type":"ContainerDied","Data":"d167736954f6036dec2403afd5409b0c30a6412ac0a780c181c88a5e22a97e4c"} Dec 01 20:38:52 crc kubenswrapper[4802]: I1201 20:38:52.787473 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:52 crc kubenswrapper[4802]: I1201 20:38:52.899260 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwmvd\" (UniqueName: \"kubernetes.io/projected/5daf0e64-4a00-45c3-9830-46f81436faff-kube-api-access-lwmvd\") pod \"5daf0e64-4a00-45c3-9830-46f81436faff\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " Dec 01 20:38:52 crc kubenswrapper[4802]: I1201 20:38:52.899498 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-inventory\") pod \"5daf0e64-4a00-45c3-9830-46f81436faff\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " Dec 01 20:38:52 crc kubenswrapper[4802]: I1201 20:38:52.899526 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ceph\") pod \"5daf0e64-4a00-45c3-9830-46f81436faff\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " Dec 01 20:38:52 crc kubenswrapper[4802]: I1201 20:38:52.899638 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ssh-key\") pod \"5daf0e64-4a00-45c3-9830-46f81436faff\" (UID: \"5daf0e64-4a00-45c3-9830-46f81436faff\") " Dec 01 20:38:52 crc kubenswrapper[4802]: I1201 20:38:52.904882 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ceph" (OuterVolumeSpecName: "ceph") pod "5daf0e64-4a00-45c3-9830-46f81436faff" (UID: "5daf0e64-4a00-45c3-9830-46f81436faff"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:52 crc kubenswrapper[4802]: I1201 20:38:52.905749 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5daf0e64-4a00-45c3-9830-46f81436faff-kube-api-access-lwmvd" (OuterVolumeSpecName: "kube-api-access-lwmvd") pod "5daf0e64-4a00-45c3-9830-46f81436faff" (UID: "5daf0e64-4a00-45c3-9830-46f81436faff"). InnerVolumeSpecName "kube-api-access-lwmvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:38:52 crc kubenswrapper[4802]: I1201 20:38:52.924405 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5daf0e64-4a00-45c3-9830-46f81436faff" (UID: "5daf0e64-4a00-45c3-9830-46f81436faff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:52 crc kubenswrapper[4802]: I1201 20:38:52.929998 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-inventory" (OuterVolumeSpecName: "inventory") pod "5daf0e64-4a00-45c3-9830-46f81436faff" (UID: "5daf0e64-4a00-45c3-9830-46f81436faff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.001474 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.001511 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwmvd\" (UniqueName: \"kubernetes.io/projected/5daf0e64-4a00-45c3-9830-46f81436faff-kube-api-access-lwmvd\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.001526 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.001539 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5daf0e64-4a00-45c3-9830-46f81436faff-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.329082 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" event={"ID":"5daf0e64-4a00-45c3-9830-46f81436faff","Type":"ContainerDied","Data":"eb0847378aa14fc2c398a9edb3aa9d3a63e6c1a96035263397c0887962c9add0"} Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.329403 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb0847378aa14fc2c398a9edb3aa9d3a63e6c1a96035263397c0887962c9add0" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.329324 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.520657 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg"] Dec 01 20:38:53 crc kubenswrapper[4802]: E1201 20:38:53.521184 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5daf0e64-4a00-45c3-9830-46f81436faff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.521240 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5daf0e64-4a00-45c3-9830-46f81436faff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.521604 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="5daf0e64-4a00-45c3-9830-46f81436faff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.522785 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.525480 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.525653 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.525723 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.525847 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.526006 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.526491 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.527437 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.527740 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.538634 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg"] Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.714087 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckfkn\" (UniqueName: 
\"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-kube-api-access-ckfkn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.714165 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.714420 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.714497 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.714537 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.714721 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.714842 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.714892 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.715096 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.715138 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.715237 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.715307 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.715490 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" 
(UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.817427 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.817823 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckfkn\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-kube-api-access-ckfkn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.817871 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819094 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 
20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819148 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819174 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819228 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819264 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819286 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819311 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819338 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819357 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.819378 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.822871 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.822949 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.823711 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.825981 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 
crc kubenswrapper[4802]: I1201 20:38:53.827642 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.828050 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.828062 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.828174 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.828169 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.828693 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.829143 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.830027 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.847475 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckfkn\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-kube-api-access-ckfkn\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-96mpg\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:53 crc kubenswrapper[4802]: I1201 20:38:53.856889 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:38:54 crc kubenswrapper[4802]: I1201 20:38:54.389479 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg"] Dec 01 20:38:55 crc kubenswrapper[4802]: I1201 20:38:55.346408 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" event={"ID":"189db1d5-3210-4707-b02f-8434a36a5791","Type":"ContainerStarted","Data":"3c654985a3c83981b1eeaca7144f653cf189c7c568af66a48ca16f091a492405"} Dec 01 20:38:56 crc kubenswrapper[4802]: I1201 20:38:56.356563 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" event={"ID":"189db1d5-3210-4707-b02f-8434a36a5791","Type":"ContainerStarted","Data":"4727369f03758bf24da18a24473cd9cdbf6f3682273c18b72255242677009fd1"} Dec 01 20:38:56 crc kubenswrapper[4802]: I1201 20:38:56.383578 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" podStartSLOduration=2.535700073 podStartE2EDuration="3.383559068s" podCreationTimestamp="2025-12-01 20:38:53 +0000 UTC" firstStartedPulling="2025-12-01 20:38:54.391525651 +0000 UTC m=+2555.954085312" lastFinishedPulling="2025-12-01 20:38:55.239384666 +0000 UTC m=+2556.801944307" observedRunningTime="2025-12-01 20:38:56.383442805 +0000 UTC m=+2557.946002456" watchObservedRunningTime="2025-12-01 20:38:56.383559068 +0000 UTC m=+2557.946118719" Dec 01 20:39:26 crc kubenswrapper[4802]: I1201 20:39:26.600308 4802 
generic.go:334] "Generic (PLEG): container finished" podID="189db1d5-3210-4707-b02f-8434a36a5791" containerID="4727369f03758bf24da18a24473cd9cdbf6f3682273c18b72255242677009fd1" exitCode=0 Dec 01 20:39:26 crc kubenswrapper[4802]: I1201 20:39:26.600418 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" event={"ID":"189db1d5-3210-4707-b02f-8434a36a5791","Type":"ContainerDied","Data":"4727369f03758bf24da18a24473cd9cdbf6f3682273c18b72255242677009fd1"} Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.018256 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122247 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-inventory\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122582 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-nova-combined-ca-bundle\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122606 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-repo-setup-combined-ca-bundle\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122632 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ovn-combined-ca-bundle\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122650 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-bootstrap-combined-ca-bundle\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122681 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122700 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckfkn\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-kube-api-access-ckfkn\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122733 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ssh-key\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122751 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122780 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ceph\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122797 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-ovn-default-certs-0\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122828 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-neutron-metadata-combined-ca-bundle\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.122854 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-libvirt-combined-ca-bundle\") pod \"189db1d5-3210-4707-b02f-8434a36a5791\" (UID: \"189db1d5-3210-4707-b02f-8434a36a5791\") " Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.129423 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: 
"openstack-edpm-ipam-libvirt-default-certs-0") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.130187 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ceph" (OuterVolumeSpecName: "ceph") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.130188 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-kube-api-access-ckfkn" (OuterVolumeSpecName: "kube-api-access-ckfkn") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "kube-api-access-ckfkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.130255 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.130580 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.130993 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.133815 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.136239 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.136475 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.136597 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.136822 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.155746 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-inventory" (OuterVolumeSpecName: "inventory") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.167357 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "189db1d5-3210-4707-b02f-8434a36a5791" (UID: "189db1d5-3210-4707-b02f-8434a36a5791"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.225939 4802 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.225988 4802 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226000 4802 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226011 4802 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226024 4802 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226036 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckfkn\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-kube-api-access-ckfkn\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226046 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226056 4802 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226066 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226076 4802 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/189db1d5-3210-4707-b02f-8434a36a5791-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226088 4802 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226100 4802 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.226110 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/189db1d5-3210-4707-b02f-8434a36a5791-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.617364 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" event={"ID":"189db1d5-3210-4707-b02f-8434a36a5791","Type":"ContainerDied","Data":"3c654985a3c83981b1eeaca7144f653cf189c7c568af66a48ca16f091a492405"} Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.617403 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c654985a3c83981b1eeaca7144f653cf189c7c568af66a48ca16f091a492405" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.617425 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-96mpg" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.710668 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp"] Dec 01 20:39:28 crc kubenswrapper[4802]: E1201 20:39:28.711172 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189db1d5-3210-4707-b02f-8434a36a5791" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.711186 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="189db1d5-3210-4707-b02f-8434a36a5791" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.711478 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="189db1d5-3210-4707-b02f-8434a36a5791" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.714109 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.716932 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.717309 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.717609 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.717728 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.717775 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.741324 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp"] Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.833627 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.833668 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h668w\" (UniqueName: \"kubernetes.io/projected/7db48dd7-0156-4073-918a-5f4e4c1244d9-kube-api-access-h668w\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " 
pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.834796 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.834990 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.936940 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.936997 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.937017 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h668w\" (UniqueName: 
\"kubernetes.io/projected/7db48dd7-0156-4073-918a-5f4e4c1244d9-kube-api-access-h668w\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.937144 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.940494 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.940507 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.941931 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:28 crc kubenswrapper[4802]: I1201 20:39:28.959345 
4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h668w\" (UniqueName: \"kubernetes.io/projected/7db48dd7-0156-4073-918a-5f4e4c1244d9-kube-api-access-h668w\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:29 crc kubenswrapper[4802]: I1201 20:39:29.033131 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:29 crc kubenswrapper[4802]: I1201 20:39:29.557242 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp"] Dec 01 20:39:29 crc kubenswrapper[4802]: I1201 20:39:29.628012 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" event={"ID":"7db48dd7-0156-4073-918a-5f4e4c1244d9","Type":"ContainerStarted","Data":"188ae1da1e3a6c9b7c1c22c72270f5d66879a0adf51d851d200a0577b4c2bb59"} Dec 01 20:39:30 crc kubenswrapper[4802]: I1201 20:39:30.640024 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" event={"ID":"7db48dd7-0156-4073-918a-5f4e4c1244d9","Type":"ContainerStarted","Data":"afc005aba0c0843734ec48b57783d44ca276ef8253e158ff086af83e7e0daf68"} Dec 01 20:39:30 crc kubenswrapper[4802]: I1201 20:39:30.666650 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" podStartSLOduration=2.05892919 podStartE2EDuration="2.66661378s" podCreationTimestamp="2025-12-01 20:39:28 +0000 UTC" firstStartedPulling="2025-12-01 20:39:29.566718974 +0000 UTC m=+2591.129278615" lastFinishedPulling="2025-12-01 20:39:30.174403564 +0000 UTC m=+2591.736963205" observedRunningTime="2025-12-01 20:39:30.661350827 
+0000 UTC m=+2592.223910468" watchObservedRunningTime="2025-12-01 20:39:30.66661378 +0000 UTC m=+2592.229173441" Dec 01 20:39:35 crc kubenswrapper[4802]: I1201 20:39:35.686650 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" event={"ID":"7db48dd7-0156-4073-918a-5f4e4c1244d9","Type":"ContainerDied","Data":"afc005aba0c0843734ec48b57783d44ca276ef8253e158ff086af83e7e0daf68"} Dec 01 20:39:35 crc kubenswrapper[4802]: I1201 20:39:35.686686 4802 generic.go:334] "Generic (PLEG): container finished" podID="7db48dd7-0156-4073-918a-5f4e4c1244d9" containerID="afc005aba0c0843734ec48b57783d44ca276ef8253e158ff086af83e7e0daf68" exitCode=0 Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.109254 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.298344 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ssh-key\") pod \"7db48dd7-0156-4073-918a-5f4e4c1244d9\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.298443 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-inventory\") pod \"7db48dd7-0156-4073-918a-5f4e4c1244d9\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.298510 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h668w\" (UniqueName: \"kubernetes.io/projected/7db48dd7-0156-4073-918a-5f4e4c1244d9-kube-api-access-h668w\") pod \"7db48dd7-0156-4073-918a-5f4e4c1244d9\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " Dec 01 20:39:37 crc 
kubenswrapper[4802]: I1201 20:39:37.298605 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ceph\") pod \"7db48dd7-0156-4073-918a-5f4e4c1244d9\" (UID: \"7db48dd7-0156-4073-918a-5f4e4c1244d9\") " Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.308441 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db48dd7-0156-4073-918a-5f4e4c1244d9-kube-api-access-h668w" (OuterVolumeSpecName: "kube-api-access-h668w") pod "7db48dd7-0156-4073-918a-5f4e4c1244d9" (UID: "7db48dd7-0156-4073-918a-5f4e4c1244d9"). InnerVolumeSpecName "kube-api-access-h668w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.310316 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ceph" (OuterVolumeSpecName: "ceph") pod "7db48dd7-0156-4073-918a-5f4e4c1244d9" (UID: "7db48dd7-0156-4073-918a-5f4e4c1244d9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.339910 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7db48dd7-0156-4073-918a-5f4e4c1244d9" (UID: "7db48dd7-0156-4073-918a-5f4e4c1244d9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.341305 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-inventory" (OuterVolumeSpecName: "inventory") pod "7db48dd7-0156-4073-918a-5f4e4c1244d9" (UID: "7db48dd7-0156-4073-918a-5f4e4c1244d9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.401692 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.401740 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.401755 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h668w\" (UniqueName: \"kubernetes.io/projected/7db48dd7-0156-4073-918a-5f4e4c1244d9-kube-api-access-h668w\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.401770 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7db48dd7-0156-4073-918a-5f4e4c1244d9-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.703564 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" event={"ID":"7db48dd7-0156-4073-918a-5f4e4c1244d9","Type":"ContainerDied","Data":"188ae1da1e3a6c9b7c1c22c72270f5d66879a0adf51d851d200a0577b4c2bb59"} Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.703604 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="188ae1da1e3a6c9b7c1c22c72270f5d66879a0adf51d851d200a0577b4c2bb59" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.703662 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.777831 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj"] Dec 01 20:39:37 crc kubenswrapper[4802]: E1201 20:39:37.778290 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db48dd7-0156-4073-918a-5f4e4c1244d9" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.778314 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db48dd7-0156-4073-918a-5f4e4c1244d9" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.778576 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db48dd7-0156-4073-918a-5f4e4c1244d9" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.779301 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.781773 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.781918 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.781960 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.782104 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.782166 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.782292 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.792583 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj"] Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.912813 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.913060 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.913097 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.913279 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwt9s\" (UniqueName: \"kubernetes.io/projected/6c2f2991-db4b-4a58-807a-f3b617d9542f-kube-api-access-dwt9s\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.913407 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:37 crc kubenswrapper[4802]: I1201 20:39:37.913431 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.015070 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.015478 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.015562 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwt9s\" (UniqueName: \"kubernetes.io/projected/6c2f2991-db4b-4a58-807a-f3b617d9542f-kube-api-access-dwt9s\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.015639 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.015668 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.015701 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.016182 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.021082 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.026640 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.034747 4802 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.034827 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.037102 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwt9s\" (UniqueName: \"kubernetes.io/projected/6c2f2991-db4b-4a58-807a-f3b617d9542f-kube-api-access-dwt9s\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zpzvj\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.099656 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.608607 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj"] Dec 01 20:39:38 crc kubenswrapper[4802]: I1201 20:39:38.714333 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" event={"ID":"6c2f2991-db4b-4a58-807a-f3b617d9542f","Type":"ContainerStarted","Data":"7d73864b67a8cb220ab83ee853c141de10fa3f82e19712cab090a008b5068105"} Dec 01 20:39:39 crc kubenswrapper[4802]: I1201 20:39:39.725295 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" event={"ID":"6c2f2991-db4b-4a58-807a-f3b617d9542f","Type":"ContainerStarted","Data":"4a3a8b32be765b74d883ed60ccbe797c57a0e63a19876ee64fd7c0a14bc56fab"} Dec 01 20:39:39 crc kubenswrapper[4802]: I1201 20:39:39.752972 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" podStartSLOduration=2.268050714 podStartE2EDuration="2.752956584s" podCreationTimestamp="2025-12-01 20:39:37 +0000 UTC" firstStartedPulling="2025-12-01 20:39:38.610646191 +0000 UTC m=+2600.173205832" lastFinishedPulling="2025-12-01 20:39:39.095552071 +0000 UTC m=+2600.658111702" observedRunningTime="2025-12-01 20:39:39.744040297 +0000 UTC m=+2601.306599968" watchObservedRunningTime="2025-12-01 20:39:39.752956584 +0000 UTC m=+2601.315516225" Dec 01 20:40:46 crc kubenswrapper[4802]: I1201 20:40:46.322848 4802 generic.go:334] "Generic (PLEG): container finished" podID="6c2f2991-db4b-4a58-807a-f3b617d9542f" containerID="4a3a8b32be765b74d883ed60ccbe797c57a0e63a19876ee64fd7c0a14bc56fab" exitCode=0 Dec 01 20:40:46 crc kubenswrapper[4802]: I1201 20:40:46.322940 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" event={"ID":"6c2f2991-db4b-4a58-807a-f3b617d9542f","Type":"ContainerDied","Data":"4a3a8b32be765b74d883ed60ccbe797c57a0e63a19876ee64fd7c0a14bc56fab"} Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.695011 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.770631 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ceph\") pod \"6c2f2991-db4b-4a58-807a-f3b617d9542f\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.770760 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovn-combined-ca-bundle\") pod \"6c2f2991-db4b-4a58-807a-f3b617d9542f\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.770858 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovncontroller-config-0\") pod \"6c2f2991-db4b-4a58-807a-f3b617d9542f\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.770892 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ssh-key\") pod \"6c2f2991-db4b-4a58-807a-f3b617d9542f\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.770969 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwt9s\" 
(UniqueName: \"kubernetes.io/projected/6c2f2991-db4b-4a58-807a-f3b617d9542f-kube-api-access-dwt9s\") pod \"6c2f2991-db4b-4a58-807a-f3b617d9542f\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.771188 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-inventory\") pod \"6c2f2991-db4b-4a58-807a-f3b617d9542f\" (UID: \"6c2f2991-db4b-4a58-807a-f3b617d9542f\") " Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.776840 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ceph" (OuterVolumeSpecName: "ceph") pod "6c2f2991-db4b-4a58-807a-f3b617d9542f" (UID: "6c2f2991-db4b-4a58-807a-f3b617d9542f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.777365 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2f2991-db4b-4a58-807a-f3b617d9542f-kube-api-access-dwt9s" (OuterVolumeSpecName: "kube-api-access-dwt9s") pod "6c2f2991-db4b-4a58-807a-f3b617d9542f" (UID: "6c2f2991-db4b-4a58-807a-f3b617d9542f"). InnerVolumeSpecName "kube-api-access-dwt9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.777687 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6c2f2991-db4b-4a58-807a-f3b617d9542f" (UID: "6c2f2991-db4b-4a58-807a-f3b617d9542f"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.794774 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "6c2f2991-db4b-4a58-807a-f3b617d9542f" (UID: "6c2f2991-db4b-4a58-807a-f3b617d9542f"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.797523 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6c2f2991-db4b-4a58-807a-f3b617d9542f" (UID: "6c2f2991-db4b-4a58-807a-f3b617d9542f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.804412 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-inventory" (OuterVolumeSpecName: "inventory") pod "6c2f2991-db4b-4a58-807a-f3b617d9542f" (UID: "6c2f2991-db4b-4a58-807a-f3b617d9542f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.873717 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwt9s\" (UniqueName: \"kubernetes.io/projected/6c2f2991-db4b-4a58-807a-f3b617d9542f-kube-api-access-dwt9s\") on node \"crc\" DevicePath \"\"" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.873750 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.873760 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.873769 4802 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.873778 4802 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6c2f2991-db4b-4a58-807a-f3b617d9542f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:40:47 crc kubenswrapper[4802]: I1201 20:40:47.873786 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2f2991-db4b-4a58-807a-f3b617d9542f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.342157 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" event={"ID":"6c2f2991-db4b-4a58-807a-f3b617d9542f","Type":"ContainerDied","Data":"7d73864b67a8cb220ab83ee853c141de10fa3f82e19712cab090a008b5068105"} Dec 01 20:40:48 crc 
kubenswrapper[4802]: I1201 20:40:48.342474 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d73864b67a8cb220ab83ee853c141de10fa3f82e19712cab090a008b5068105" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.342542 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zpzvj" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.430604 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7"] Dec 01 20:40:48 crc kubenswrapper[4802]: E1201 20:40:48.431074 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2f2991-db4b-4a58-807a-f3b617d9542f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.431092 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2f2991-db4b-4a58-807a-f3b617d9542f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.431301 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c2f2991-db4b-4a58-807a-f3b617d9542f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.431877 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.437142 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.437397 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.437515 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.437895 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.438127 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.438989 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.438999 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.442557 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7"] Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.483279 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.483355 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l28sj\" (UniqueName: \"kubernetes.io/projected/2112a496-9a70-408c-99ec-211c3ba2defe-kube-api-access-l28sj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.483586 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.483770 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.483839 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 
20:40:48.483883 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.483953 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.585727 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.585833 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l28sj\" (UniqueName: \"kubernetes.io/projected/2112a496-9a70-408c-99ec-211c3ba2defe-kube-api-access-l28sj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.585958 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.586076 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.586121 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.586186 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.586277 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.592171 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.592441 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.592773 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.592852 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.595137 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.595139 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.617743 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l28sj\" (UniqueName: \"kubernetes.io/projected/2112a496-9a70-408c-99ec-211c3ba2defe-kube-api-access-l28sj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:48 crc kubenswrapper[4802]: I1201 20:40:48.747144 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:40:49 crc kubenswrapper[4802]: I1201 20:40:49.299457 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7"] Dec 01 20:40:49 crc kubenswrapper[4802]: W1201 20:40:49.300982 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2112a496_9a70_408c_99ec_211c3ba2defe.slice/crio-f5144c98c660b9b4daf0725cd0fd376ddd76401a3cf7ecf6d93f6d04444de981 WatchSource:0}: Error finding container f5144c98c660b9b4daf0725cd0fd376ddd76401a3cf7ecf6d93f6d04444de981: Status 404 returned error can't find the container with id f5144c98c660b9b4daf0725cd0fd376ddd76401a3cf7ecf6d93f6d04444de981 Dec 01 20:40:49 crc kubenswrapper[4802]: I1201 20:40:49.360583 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" event={"ID":"2112a496-9a70-408c-99ec-211c3ba2defe","Type":"ContainerStarted","Data":"f5144c98c660b9b4daf0725cd0fd376ddd76401a3cf7ecf6d93f6d04444de981"} Dec 01 20:40:50 crc kubenswrapper[4802]: I1201 20:40:50.369243 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" event={"ID":"2112a496-9a70-408c-99ec-211c3ba2defe","Type":"ContainerStarted","Data":"e2f7b5ccd36ecde2a7e0816b8b8e93e219a6107430b915be5247581d3f1e6fd2"} Dec 01 20:40:50 crc kubenswrapper[4802]: I1201 20:40:50.388796 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" podStartSLOduration=1.888940595 podStartE2EDuration="2.388779248s" podCreationTimestamp="2025-12-01 20:40:48 +0000 UTC" firstStartedPulling="2025-12-01 20:40:49.303882087 +0000 UTC m=+2670.866441748" lastFinishedPulling="2025-12-01 20:40:49.80372074 +0000 UTC 
m=+2671.366280401" observedRunningTime="2025-12-01 20:40:50.382711719 +0000 UTC m=+2671.945271360" watchObservedRunningTime="2025-12-01 20:40:50.388779248 +0000 UTC m=+2671.951338889" Dec 01 20:40:58 crc kubenswrapper[4802]: I1201 20:40:58.088672 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:40:58 crc kubenswrapper[4802]: I1201 20:40:58.090424 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:41:28 crc kubenswrapper[4802]: I1201 20:41:28.088855 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:41:28 crc kubenswrapper[4802]: I1201 20:41:28.089446 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:41:42 crc kubenswrapper[4802]: I1201 20:41:42.800797 4802 generic.go:334] "Generic (PLEG): container finished" podID="2112a496-9a70-408c-99ec-211c3ba2defe" containerID="e2f7b5ccd36ecde2a7e0816b8b8e93e219a6107430b915be5247581d3f1e6fd2" exitCode=0 Dec 01 20:41:42 crc kubenswrapper[4802]: I1201 20:41:42.800883 
4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" event={"ID":"2112a496-9a70-408c-99ec-211c3ba2defe","Type":"ContainerDied","Data":"e2f7b5ccd36ecde2a7e0816b8b8e93e219a6107430b915be5247581d3f1e6fd2"} Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.204064 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.359755 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ceph\") pod \"2112a496-9a70-408c-99ec-211c3ba2defe\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.360212 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"2112a496-9a70-408c-99ec-211c3ba2defe\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.360258 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-nova-metadata-neutron-config-0\") pod \"2112a496-9a70-408c-99ec-211c3ba2defe\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.360280 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ssh-key\") pod \"2112a496-9a70-408c-99ec-211c3ba2defe\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " Dec 01 20:41:44 crc kubenswrapper[4802]: 
I1201 20:41:44.360340 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l28sj\" (UniqueName: \"kubernetes.io/projected/2112a496-9a70-408c-99ec-211c3ba2defe-kube-api-access-l28sj\") pod \"2112a496-9a70-408c-99ec-211c3ba2defe\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.360376 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-metadata-combined-ca-bundle\") pod \"2112a496-9a70-408c-99ec-211c3ba2defe\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.360441 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-inventory\") pod \"2112a496-9a70-408c-99ec-211c3ba2defe\" (UID: \"2112a496-9a70-408c-99ec-211c3ba2defe\") " Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.366112 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ceph" (OuterVolumeSpecName: "ceph") pod "2112a496-9a70-408c-99ec-211c3ba2defe" (UID: "2112a496-9a70-408c-99ec-211c3ba2defe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.366366 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2112a496-9a70-408c-99ec-211c3ba2defe" (UID: "2112a496-9a70-408c-99ec-211c3ba2defe"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.366907 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2112a496-9a70-408c-99ec-211c3ba2defe-kube-api-access-l28sj" (OuterVolumeSpecName: "kube-api-access-l28sj") pod "2112a496-9a70-408c-99ec-211c3ba2defe" (UID: "2112a496-9a70-408c-99ec-211c3ba2defe"). InnerVolumeSpecName "kube-api-access-l28sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.386932 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2112a496-9a70-408c-99ec-211c3ba2defe" (UID: "2112a496-9a70-408c-99ec-211c3ba2defe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.388429 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "2112a496-9a70-408c-99ec-211c3ba2defe" (UID: "2112a496-9a70-408c-99ec-211c3ba2defe"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.391228 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-inventory" (OuterVolumeSpecName: "inventory") pod "2112a496-9a70-408c-99ec-211c3ba2defe" (UID: "2112a496-9a70-408c-99ec-211c3ba2defe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.393566 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "2112a496-9a70-408c-99ec-211c3ba2defe" (UID: "2112a496-9a70-408c-99ec-211c3ba2defe"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.462525 4802 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.463044 4802 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.463062 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.463073 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l28sj\" (UniqueName: \"kubernetes.io/projected/2112a496-9a70-408c-99ec-211c3ba2defe-kube-api-access-l28sj\") on node \"crc\" DevicePath \"\"" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.463085 4802 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.463093 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.463102 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2112a496-9a70-408c-99ec-211c3ba2defe-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.826901 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" event={"ID":"2112a496-9a70-408c-99ec-211c3ba2defe","Type":"ContainerDied","Data":"f5144c98c660b9b4daf0725cd0fd376ddd76401a3cf7ecf6d93f6d04444de981"} Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.827164 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5144c98c660b9b4daf0725cd0fd376ddd76401a3cf7ecf6d93f6d04444de981" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.826952 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.905178 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc"] Dec 01 20:41:44 crc kubenswrapper[4802]: E1201 20:41:44.905546 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2112a496-9a70-408c-99ec-211c3ba2defe" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.905565 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="2112a496-9a70-408c-99ec-211c3ba2defe" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.905734 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="2112a496-9a70-408c-99ec-211c3ba2defe" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.906300 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.907913 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.908029 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.908491 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.909077 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.909333 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.909364 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:41:44 crc kubenswrapper[4802]: I1201 20:41:44.922598 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc"] Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.082811 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.082944 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.083043 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.083076 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.083110 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvv6\" (UniqueName: \"kubernetes.io/projected/bad07c68-596f-44ca-9580-335176bd8049-kube-api-access-rcvv6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.083134 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.184685 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.185081 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.185147 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvv6\" (UniqueName: \"kubernetes.io/projected/bad07c68-596f-44ca-9580-335176bd8049-kube-api-access-rcvv6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.185187 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.185269 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.185321 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.188869 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.189161 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.189350 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: 
I1201 20:41:45.189463 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.193891 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.202037 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvv6\" (UniqueName: \"kubernetes.io/projected/bad07c68-596f-44ca-9580-335176bd8049-kube-api-access-rcvv6\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.228445 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.766073 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc"] Dec 01 20:41:45 crc kubenswrapper[4802]: W1201 20:41:45.767547 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbad07c68_596f_44ca_9580_335176bd8049.slice/crio-5a2e4b55be795781ec1ad5b73cec87a3af12d5cbc101d0fb91ebbd9cbe253289 WatchSource:0}: Error finding container 5a2e4b55be795781ec1ad5b73cec87a3af12d5cbc101d0fb91ebbd9cbe253289: Status 404 returned error can't find the container with id 5a2e4b55be795781ec1ad5b73cec87a3af12d5cbc101d0fb91ebbd9cbe253289 Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.770740 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 20:41:45 crc kubenswrapper[4802]: I1201 20:41:45.835139 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" event={"ID":"bad07c68-596f-44ca-9580-335176bd8049","Type":"ContainerStarted","Data":"5a2e4b55be795781ec1ad5b73cec87a3af12d5cbc101d0fb91ebbd9cbe253289"} Dec 01 20:41:46 crc kubenswrapper[4802]: I1201 20:41:46.845514 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" event={"ID":"bad07c68-596f-44ca-9580-335176bd8049","Type":"ContainerStarted","Data":"b432d8810475e34225b9269ed0a642d90b28dd8c601b51cf0c4d6e07cd994535"} Dec 01 20:41:46 crc kubenswrapper[4802]: I1201 20:41:46.870915 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" podStartSLOduration=2.367831836 podStartE2EDuration="2.870887988s" podCreationTimestamp="2025-12-01 20:41:44 +0000 UTC" 
firstStartedPulling="2025-12-01 20:41:45.770502477 +0000 UTC m=+2727.333062118" lastFinishedPulling="2025-12-01 20:41:46.273558609 +0000 UTC m=+2727.836118270" observedRunningTime="2025-12-01 20:41:46.861770155 +0000 UTC m=+2728.424329806" watchObservedRunningTime="2025-12-01 20:41:46.870887988 +0000 UTC m=+2728.433447629" Dec 01 20:41:58 crc kubenswrapper[4802]: I1201 20:41:58.088519 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:41:58 crc kubenswrapper[4802]: I1201 20:41:58.089100 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:41:58 crc kubenswrapper[4802]: I1201 20:41:58.089150 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:41:58 crc kubenswrapper[4802]: I1201 20:41:58.089980 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b52124a617ff9aff5ae5f96828f2e1b5ca32a7fd19dd6889c883a07f00e9bf3"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:41:58 crc kubenswrapper[4802]: I1201 20:41:58.090031 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" 
containerName="machine-config-daemon" containerID="cri-o://4b52124a617ff9aff5ae5f96828f2e1b5ca32a7fd19dd6889c883a07f00e9bf3" gracePeriod=600 Dec 01 20:41:58 crc kubenswrapper[4802]: I1201 20:41:58.948453 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="4b52124a617ff9aff5ae5f96828f2e1b5ca32a7fd19dd6889c883a07f00e9bf3" exitCode=0 Dec 01 20:41:58 crc kubenswrapper[4802]: I1201 20:41:58.948560 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"4b52124a617ff9aff5ae5f96828f2e1b5ca32a7fd19dd6889c883a07f00e9bf3"} Dec 01 20:41:58 crc kubenswrapper[4802]: I1201 20:41:58.948999 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098"} Dec 01 20:41:58 crc kubenswrapper[4802]: I1201 20:41:58.949023 4802 scope.go:117] "RemoveContainer" containerID="e47d82d09547c27669fba0f56ba973a75f79c77d4e5a848baa5915e686ca6c88" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.224685 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bs7w6"] Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.233813 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.253109 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs7w6"] Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.320301 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-catalog-content\") pod \"redhat-operators-bs7w6\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.320347 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftx68\" (UniqueName: \"kubernetes.io/projected/3e088cf2-ed76-422b-9cff-62e05009f281-kube-api-access-ftx68\") pod \"redhat-operators-bs7w6\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.320372 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-utilities\") pod \"redhat-operators-bs7w6\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.421424 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-catalog-content\") pod \"redhat-operators-bs7w6\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.421466 4802 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ftx68\" (UniqueName: \"kubernetes.io/projected/3e088cf2-ed76-422b-9cff-62e05009f281-kube-api-access-ftx68\") pod \"redhat-operators-bs7w6\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.421510 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-utilities\") pod \"redhat-operators-bs7w6\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.421993 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-catalog-content\") pod \"redhat-operators-bs7w6\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.422029 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-utilities\") pod \"redhat-operators-bs7w6\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.440166 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftx68\" (UniqueName: \"kubernetes.io/projected/3e088cf2-ed76-422b-9cff-62e05009f281-kube-api-access-ftx68\") pod \"redhat-operators-bs7w6\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:18 crc kubenswrapper[4802]: I1201 20:42:18.580084 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:19 crc kubenswrapper[4802]: I1201 20:42:19.083462 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs7w6"] Dec 01 20:42:19 crc kubenswrapper[4802]: I1201 20:42:19.116538 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs7w6" event={"ID":"3e088cf2-ed76-422b-9cff-62e05009f281","Type":"ContainerStarted","Data":"d425891e81f8f2c0ed5c4e5cb6421eec9cd356d4a5ecc4f7ed8f6eeed902164b"} Dec 01 20:42:20 crc kubenswrapper[4802]: I1201 20:42:20.125814 4802 generic.go:334] "Generic (PLEG): container finished" podID="3e088cf2-ed76-422b-9cff-62e05009f281" containerID="433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647" exitCode=0 Dec 01 20:42:20 crc kubenswrapper[4802]: I1201 20:42:20.125936 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs7w6" event={"ID":"3e088cf2-ed76-422b-9cff-62e05009f281","Type":"ContainerDied","Data":"433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647"} Dec 01 20:42:22 crc kubenswrapper[4802]: I1201 20:42:22.153257 4802 generic.go:334] "Generic (PLEG): container finished" podID="3e088cf2-ed76-422b-9cff-62e05009f281" containerID="bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3" exitCode=0 Dec 01 20:42:22 crc kubenswrapper[4802]: I1201 20:42:22.153306 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs7w6" event={"ID":"3e088cf2-ed76-422b-9cff-62e05009f281","Type":"ContainerDied","Data":"bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3"} Dec 01 20:42:31 crc kubenswrapper[4802]: I1201 20:42:31.233769 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs7w6" 
event={"ID":"3e088cf2-ed76-422b-9cff-62e05009f281","Type":"ContainerStarted","Data":"d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d"} Dec 01 20:42:32 crc kubenswrapper[4802]: I1201 20:42:32.280021 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bs7w6" podStartSLOduration=3.604748583 podStartE2EDuration="14.279999131s" podCreationTimestamp="2025-12-01 20:42:18 +0000 UTC" firstStartedPulling="2025-12-01 20:42:20.129772275 +0000 UTC m=+2761.692331916" lastFinishedPulling="2025-12-01 20:42:30.805022813 +0000 UTC m=+2772.367582464" observedRunningTime="2025-12-01 20:42:32.273282812 +0000 UTC m=+2773.835842543" watchObservedRunningTime="2025-12-01 20:42:32.279999131 +0000 UTC m=+2773.842558772" Dec 01 20:42:38 crc kubenswrapper[4802]: I1201 20:42:38.581078 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:38 crc kubenswrapper[4802]: I1201 20:42:38.582363 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:38 crc kubenswrapper[4802]: I1201 20:42:38.622841 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:39 crc kubenswrapper[4802]: I1201 20:42:39.402133 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:39 crc kubenswrapper[4802]: I1201 20:42:39.450799 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs7w6"] Dec 01 20:42:41 crc kubenswrapper[4802]: I1201 20:42:41.368095 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bs7w6" podUID="3e088cf2-ed76-422b-9cff-62e05009f281" containerName="registry-server" 
containerID="cri-o://d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d" gracePeriod=2 Dec 01 20:42:41 crc kubenswrapper[4802]: I1201 20:42:41.795163 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:41 crc kubenswrapper[4802]: I1201 20:42:41.904022 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-utilities\") pod \"3e088cf2-ed76-422b-9cff-62e05009f281\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " Dec 01 20:42:41 crc kubenswrapper[4802]: I1201 20:42:41.904086 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-catalog-content\") pod \"3e088cf2-ed76-422b-9cff-62e05009f281\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " Dec 01 20:42:41 crc kubenswrapper[4802]: I1201 20:42:41.904295 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftx68\" (UniqueName: \"kubernetes.io/projected/3e088cf2-ed76-422b-9cff-62e05009f281-kube-api-access-ftx68\") pod \"3e088cf2-ed76-422b-9cff-62e05009f281\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " Dec 01 20:42:41 crc kubenswrapper[4802]: I1201 20:42:41.905002 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-utilities" (OuterVolumeSpecName: "utilities") pod "3e088cf2-ed76-422b-9cff-62e05009f281" (UID: "3e088cf2-ed76-422b-9cff-62e05009f281"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:42:41 crc kubenswrapper[4802]: I1201 20:42:41.909550 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e088cf2-ed76-422b-9cff-62e05009f281-kube-api-access-ftx68" (OuterVolumeSpecName: "kube-api-access-ftx68") pod "3e088cf2-ed76-422b-9cff-62e05009f281" (UID: "3e088cf2-ed76-422b-9cff-62e05009f281"). InnerVolumeSpecName "kube-api-access-ftx68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.005775 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e088cf2-ed76-422b-9cff-62e05009f281" (UID: "3e088cf2-ed76-422b-9cff-62e05009f281"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.006443 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-catalog-content\") pod \"3e088cf2-ed76-422b-9cff-62e05009f281\" (UID: \"3e088cf2-ed76-422b-9cff-62e05009f281\") " Dec 01 20:42:42 crc kubenswrapper[4802]: W1201 20:42:42.006586 4802 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3e088cf2-ed76-422b-9cff-62e05009f281/volumes/kubernetes.io~empty-dir/catalog-content Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.006611 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e088cf2-ed76-422b-9cff-62e05009f281" (UID: "3e088cf2-ed76-422b-9cff-62e05009f281"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.006962 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.006985 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e088cf2-ed76-422b-9cff-62e05009f281-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.007004 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftx68\" (UniqueName: \"kubernetes.io/projected/3e088cf2-ed76-422b-9cff-62e05009f281-kube-api-access-ftx68\") on node \"crc\" DevicePath \"\"" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.378556 4802 generic.go:334] "Generic (PLEG): container finished" podID="3e088cf2-ed76-422b-9cff-62e05009f281" containerID="d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d" exitCode=0 Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.378620 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs7w6" event={"ID":"3e088cf2-ed76-422b-9cff-62e05009f281","Type":"ContainerDied","Data":"d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d"} Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.378659 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs7w6" event={"ID":"3e088cf2-ed76-422b-9cff-62e05009f281","Type":"ContainerDied","Data":"d425891e81f8f2c0ed5c4e5cb6421eec9cd356d4a5ecc4f7ed8f6eeed902164b"} Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.378683 4802 scope.go:117] "RemoveContainer" containerID="d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.378881 
4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs7w6" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.412443 4802 scope.go:117] "RemoveContainer" containerID="bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.423028 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs7w6"] Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.432647 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bs7w6"] Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.435684 4802 scope.go:117] "RemoveContainer" containerID="433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.484611 4802 scope.go:117] "RemoveContainer" containerID="d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d" Dec 01 20:42:42 crc kubenswrapper[4802]: E1201 20:42:42.485090 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d\": container with ID starting with d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d not found: ID does not exist" containerID="d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.485126 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d"} err="failed to get container status \"d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d\": rpc error: code = NotFound desc = could not find container \"d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d\": container with ID starting with 
d51e6a07d769668e2789475129395c666bd1788237a8a96d1760fcff5abcb56d not found: ID does not exist" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.485164 4802 scope.go:117] "RemoveContainer" containerID="bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3" Dec 01 20:42:42 crc kubenswrapper[4802]: E1201 20:42:42.485673 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3\": container with ID starting with bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3 not found: ID does not exist" containerID="bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.485707 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3"} err="failed to get container status \"bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3\": rpc error: code = NotFound desc = could not find container \"bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3\": container with ID starting with bfe704dd2517fa04c0c4f2bedc1d79e156c97c6535e8c7aba96a89d45f1eadf3 not found: ID does not exist" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.485737 4802 scope.go:117] "RemoveContainer" containerID="433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647" Dec 01 20:42:42 crc kubenswrapper[4802]: E1201 20:42:42.486022 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647\": container with ID starting with 433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647 not found: ID does not exist" containerID="433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647" Dec 01 20:42:42 crc 
kubenswrapper[4802]: I1201 20:42:42.486043 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647"} err="failed to get container status \"433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647\": rpc error: code = NotFound desc = could not find container \"433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647\": container with ID starting with 433a8ed5893adf582b7e1ad6b0fdee9aa42d2a0eb474fa5b78bfa435c47bd647 not found: ID does not exist" Dec 01 20:42:42 crc kubenswrapper[4802]: I1201 20:42:42.730016 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e088cf2-ed76-422b-9cff-62e05009f281" path="/var/lib/kubelet/pods/3e088cf2-ed76-422b-9cff-62e05009f281/volumes" Dec 01 20:43:58 crc kubenswrapper[4802]: I1201 20:43:58.088564 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:43:58 crc kubenswrapper[4802]: I1201 20:43:58.089107 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 20:44:27.771883 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-77k4n"] Dec 01 20:44:27 crc kubenswrapper[4802]: E1201 20:44:27.772936 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e088cf2-ed76-422b-9cff-62e05009f281" containerName="extract-content" Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 
20:44:27.772951 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e088cf2-ed76-422b-9cff-62e05009f281" containerName="extract-content" Dec 01 20:44:27 crc kubenswrapper[4802]: E1201 20:44:27.772963 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e088cf2-ed76-422b-9cff-62e05009f281" containerName="extract-utilities" Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 20:44:27.772969 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e088cf2-ed76-422b-9cff-62e05009f281" containerName="extract-utilities" Dec 01 20:44:27 crc kubenswrapper[4802]: E1201 20:44:27.772991 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e088cf2-ed76-422b-9cff-62e05009f281" containerName="registry-server" Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 20:44:27.772997 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e088cf2-ed76-422b-9cff-62e05009f281" containerName="registry-server" Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 20:44:27.773173 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e088cf2-ed76-422b-9cff-62e05009f281" containerName="registry-server" Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 20:44:27.774523 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 20:44:27.782396 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77k4n"] Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 20:44:27.968336 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-catalog-content\") pod \"certified-operators-77k4n\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 20:44:27.968503 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnx67\" (UniqueName: \"kubernetes.io/projected/e551ade9-177c-4916-8a41-5c2efbe47e0b-kube-api-access-hnx67\") pod \"certified-operators-77k4n\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:27 crc kubenswrapper[4802]: I1201 20:44:27.968574 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-utilities\") pod \"certified-operators-77k4n\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.070308 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnx67\" (UniqueName: \"kubernetes.io/projected/e551ade9-177c-4916-8a41-5c2efbe47e0b-kube-api-access-hnx67\") pod \"certified-operators-77k4n\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.070410 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-utilities\") pod \"certified-operators-77k4n\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.070482 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-catalog-content\") pod \"certified-operators-77k4n\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.070895 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-utilities\") pod \"certified-operators-77k4n\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.071034 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-catalog-content\") pod \"certified-operators-77k4n\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.089647 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnx67\" (UniqueName: \"kubernetes.io/projected/e551ade9-177c-4916-8a41-5c2efbe47e0b-kube-api-access-hnx67\") pod \"certified-operators-77k4n\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.090398 4802 patch_prober.go:28] interesting 
pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.090457 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.099300 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:28 crc kubenswrapper[4802]: I1201 20:44:28.624998 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77k4n"] Dec 01 20:44:29 crc kubenswrapper[4802]: I1201 20:44:29.306373 4802 generic.go:334] "Generic (PLEG): container finished" podID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerID="02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf" exitCode=0 Dec 01 20:44:29 crc kubenswrapper[4802]: I1201 20:44:29.306693 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77k4n" event={"ID":"e551ade9-177c-4916-8a41-5c2efbe47e0b","Type":"ContainerDied","Data":"02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf"} Dec 01 20:44:29 crc kubenswrapper[4802]: I1201 20:44:29.306767 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77k4n" event={"ID":"e551ade9-177c-4916-8a41-5c2efbe47e0b","Type":"ContainerStarted","Data":"078a9e67bde5650950a3388a28bd2bcacdba1016298b86883d982704a2418c65"} Dec 01 20:44:30 crc kubenswrapper[4802]: I1201 20:44:30.317316 4802 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77k4n" event={"ID":"e551ade9-177c-4916-8a41-5c2efbe47e0b","Type":"ContainerStarted","Data":"f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a"} Dec 01 20:44:31 crc kubenswrapper[4802]: I1201 20:44:31.329552 4802 generic.go:334] "Generic (PLEG): container finished" podID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerID="f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a" exitCode=0 Dec 01 20:44:31 crc kubenswrapper[4802]: I1201 20:44:31.329616 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77k4n" event={"ID":"e551ade9-177c-4916-8a41-5c2efbe47e0b","Type":"ContainerDied","Data":"f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a"} Dec 01 20:44:32 crc kubenswrapper[4802]: I1201 20:44:32.338386 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77k4n" event={"ID":"e551ade9-177c-4916-8a41-5c2efbe47e0b","Type":"ContainerStarted","Data":"7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50"} Dec 01 20:44:32 crc kubenswrapper[4802]: I1201 20:44:32.361225 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-77k4n" podStartSLOduration=2.672682234 podStartE2EDuration="5.361193572s" podCreationTimestamp="2025-12-01 20:44:27 +0000 UTC" firstStartedPulling="2025-12-01 20:44:29.30803063 +0000 UTC m=+2890.870590311" lastFinishedPulling="2025-12-01 20:44:31.996542008 +0000 UTC m=+2893.559101649" observedRunningTime="2025-12-01 20:44:32.3527471 +0000 UTC m=+2893.915306761" watchObservedRunningTime="2025-12-01 20:44:32.361193572 +0000 UTC m=+2893.923753213" Dec 01 20:44:38 crc kubenswrapper[4802]: I1201 20:44:38.100082 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:38 crc kubenswrapper[4802]: 
I1201 20:44:38.100579 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:38 crc kubenswrapper[4802]: I1201 20:44:38.170746 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:38 crc kubenswrapper[4802]: I1201 20:44:38.432007 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:38 crc kubenswrapper[4802]: I1201 20:44:38.479300 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77k4n"] Dec 01 20:44:40 crc kubenswrapper[4802]: I1201 20:44:40.411035 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-77k4n" podUID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerName="registry-server" containerID="cri-o://7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50" gracePeriod=2 Dec 01 20:44:40 crc kubenswrapper[4802]: I1201 20:44:40.867602 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:40 crc kubenswrapper[4802]: I1201 20:44:40.897655 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnx67\" (UniqueName: \"kubernetes.io/projected/e551ade9-177c-4916-8a41-5c2efbe47e0b-kube-api-access-hnx67\") pod \"e551ade9-177c-4916-8a41-5c2efbe47e0b\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " Dec 01 20:44:40 crc kubenswrapper[4802]: I1201 20:44:40.897709 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-utilities\") pod \"e551ade9-177c-4916-8a41-5c2efbe47e0b\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " Dec 01 20:44:40 crc kubenswrapper[4802]: I1201 20:44:40.897771 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-catalog-content\") pod \"e551ade9-177c-4916-8a41-5c2efbe47e0b\" (UID: \"e551ade9-177c-4916-8a41-5c2efbe47e0b\") " Dec 01 20:44:40 crc kubenswrapper[4802]: I1201 20:44:40.899426 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-utilities" (OuterVolumeSpecName: "utilities") pod "e551ade9-177c-4916-8a41-5c2efbe47e0b" (UID: "e551ade9-177c-4916-8a41-5c2efbe47e0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:44:40 crc kubenswrapper[4802]: I1201 20:44:40.903133 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e551ade9-177c-4916-8a41-5c2efbe47e0b-kube-api-access-hnx67" (OuterVolumeSpecName: "kube-api-access-hnx67") pod "e551ade9-177c-4916-8a41-5c2efbe47e0b" (UID: "e551ade9-177c-4916-8a41-5c2efbe47e0b"). InnerVolumeSpecName "kube-api-access-hnx67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:44:40 crc kubenswrapper[4802]: I1201 20:44:40.999449 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnx67\" (UniqueName: \"kubernetes.io/projected/e551ade9-177c-4916-8a41-5c2efbe47e0b-kube-api-access-hnx67\") on node \"crc\" DevicePath \"\"" Dec 01 20:44:40 crc kubenswrapper[4802]: I1201 20:44:40.999479 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.262166 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e551ade9-177c-4916-8a41-5c2efbe47e0b" (UID: "e551ade9-177c-4916-8a41-5c2efbe47e0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.303410 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e551ade9-177c-4916-8a41-5c2efbe47e0b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.423100 4802 generic.go:334] "Generic (PLEG): container finished" podID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerID="7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50" exitCode=0 Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.423145 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77k4n" event={"ID":"e551ade9-177c-4916-8a41-5c2efbe47e0b","Type":"ContainerDied","Data":"7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50"} Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.423163 4802 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77k4n" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.423181 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77k4n" event={"ID":"e551ade9-177c-4916-8a41-5c2efbe47e0b","Type":"ContainerDied","Data":"078a9e67bde5650950a3388a28bd2bcacdba1016298b86883d982704a2418c65"} Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.423216 4802 scope.go:117] "RemoveContainer" containerID="7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.442189 4802 scope.go:117] "RemoveContainer" containerID="f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.459318 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77k4n"] Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.468747 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-77k4n"] Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.473848 4802 scope.go:117] "RemoveContainer" containerID="02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.504363 4802 scope.go:117] "RemoveContainer" containerID="7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50" Dec 01 20:44:41 crc kubenswrapper[4802]: E1201 20:44:41.504834 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50\": container with ID starting with 7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50 not found: ID does not exist" containerID="7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.504869 
4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50"} err="failed to get container status \"7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50\": rpc error: code = NotFound desc = could not find container \"7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50\": container with ID starting with 7a4f7f7faa61da70f1a97296c209b86a5d54987c0d6def71bd44ac5f92432f50 not found: ID does not exist" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.504892 4802 scope.go:117] "RemoveContainer" containerID="f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a" Dec 01 20:44:41 crc kubenswrapper[4802]: E1201 20:44:41.505948 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a\": container with ID starting with f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a not found: ID does not exist" containerID="f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.505982 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a"} err="failed to get container status \"f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a\": rpc error: code = NotFound desc = could not find container \"f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a\": container with ID starting with f239c3a7778c830b651ad79b2b58ccf4551d162e21ff32765303bbb368ee695a not found: ID does not exist" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.506002 4802 scope.go:117] "RemoveContainer" containerID="02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf" Dec 01 20:44:41 crc kubenswrapper[4802]: E1201 
20:44:41.506352 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf\": container with ID starting with 02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf not found: ID does not exist" containerID="02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf" Dec 01 20:44:41 crc kubenswrapper[4802]: I1201 20:44:41.506380 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf"} err="failed to get container status \"02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf\": rpc error: code = NotFound desc = could not find container \"02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf\": container with ID starting with 02610f69c253997652936332e9388cf5f74f4810063a23e9ad5c3536688281cf not found: ID does not exist" Dec 01 20:44:42 crc kubenswrapper[4802]: I1201 20:44:42.738152 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e551ade9-177c-4916-8a41-5c2efbe47e0b" path="/var/lib/kubelet/pods/e551ade9-177c-4916-8a41-5c2efbe47e0b/volumes" Dec 01 20:44:58 crc kubenswrapper[4802]: I1201 20:44:58.088794 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:44:58 crc kubenswrapper[4802]: I1201 20:44:58.089294 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 01 20:44:58 crc kubenswrapper[4802]: I1201 20:44:58.089329 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:44:58 crc kubenswrapper[4802]: I1201 20:44:58.089985 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:44:58 crc kubenswrapper[4802]: I1201 20:44:58.090047 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" gracePeriod=600 Dec 01 20:44:58 crc kubenswrapper[4802]: E1201 20:44:58.222171 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:44:58 crc kubenswrapper[4802]: I1201 20:44:58.583190 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" exitCode=0 Dec 01 20:44:58 crc kubenswrapper[4802]: I1201 20:44:58.583253 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" 
event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098"} Dec 01 20:44:58 crc kubenswrapper[4802]: I1201 20:44:58.583312 4802 scope.go:117] "RemoveContainer" containerID="4b52124a617ff9aff5ae5f96828f2e1b5ca32a7fd19dd6889c883a07f00e9bf3" Dec 01 20:44:58 crc kubenswrapper[4802]: I1201 20:44:58.583927 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:44:58 crc kubenswrapper[4802]: E1201 20:44:58.584305 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.148557 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f"] Dec 01 20:45:00 crc kubenswrapper[4802]: E1201 20:45:00.149356 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerName="extract-utilities" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.149375 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerName="extract-utilities" Dec 01 20:45:00 crc kubenswrapper[4802]: E1201 20:45:00.149403 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerName="extract-content" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.149411 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerName="extract-content" Dec 01 
20:45:00 crc kubenswrapper[4802]: E1201 20:45:00.149446 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerName="registry-server" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.149457 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerName="registry-server" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.149699 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="e551ade9-177c-4916-8a41-5c2efbe47e0b" containerName="registry-server" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.150468 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.152385 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.153297 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.158924 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f"] Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.203098 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfjzx\" (UniqueName: \"kubernetes.io/projected/771813aa-da4b-4a5f-b014-6b5ca3e34c19-kube-api-access-wfjzx\") pod \"collect-profiles-29410365-px96f\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.203158 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/771813aa-da4b-4a5f-b014-6b5ca3e34c19-config-volume\") pod \"collect-profiles-29410365-px96f\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.203394 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/771813aa-da4b-4a5f-b014-6b5ca3e34c19-secret-volume\") pod \"collect-profiles-29410365-px96f\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.304744 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfjzx\" (UniqueName: \"kubernetes.io/projected/771813aa-da4b-4a5f-b014-6b5ca3e34c19-kube-api-access-wfjzx\") pod \"collect-profiles-29410365-px96f\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.304782 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/771813aa-da4b-4a5f-b014-6b5ca3e34c19-config-volume\") pod \"collect-profiles-29410365-px96f\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.304847 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/771813aa-da4b-4a5f-b014-6b5ca3e34c19-secret-volume\") pod \"collect-profiles-29410365-px96f\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.306009 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/771813aa-da4b-4a5f-b014-6b5ca3e34c19-config-volume\") pod \"collect-profiles-29410365-px96f\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.310755 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/771813aa-da4b-4a5f-b014-6b5ca3e34c19-secret-volume\") pod \"collect-profiles-29410365-px96f\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.336922 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfjzx\" (UniqueName: \"kubernetes.io/projected/771813aa-da4b-4a5f-b014-6b5ca3e34c19-kube-api-access-wfjzx\") pod \"collect-profiles-29410365-px96f\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.486566 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:00 crc kubenswrapper[4802]: I1201 20:45:00.924438 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f"] Dec 01 20:45:01 crc kubenswrapper[4802]: I1201 20:45:01.616912 4802 generic.go:334] "Generic (PLEG): container finished" podID="771813aa-da4b-4a5f-b014-6b5ca3e34c19" containerID="0c4ab07dcfa5ee05c03bf13de5987635f321dc736e477bafb450ea6f3193842d" exitCode=0 Dec 01 20:45:01 crc kubenswrapper[4802]: I1201 20:45:01.617032 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" event={"ID":"771813aa-da4b-4a5f-b014-6b5ca3e34c19","Type":"ContainerDied","Data":"0c4ab07dcfa5ee05c03bf13de5987635f321dc736e477bafb450ea6f3193842d"} Dec 01 20:45:01 crc kubenswrapper[4802]: I1201 20:45:01.618220 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" event={"ID":"771813aa-da4b-4a5f-b014-6b5ca3e34c19","Type":"ContainerStarted","Data":"dacb82c6273ecbc66c8e6cfe236733548ecd1e0d9bf95df13b171a1e2c5811dc"} Dec 01 20:45:02 crc kubenswrapper[4802]: I1201 20:45:02.914869 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:02 crc kubenswrapper[4802]: I1201 20:45:02.955577 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/771813aa-da4b-4a5f-b014-6b5ca3e34c19-secret-volume\") pod \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " Dec 01 20:45:02 crc kubenswrapper[4802]: I1201 20:45:02.955657 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/771813aa-da4b-4a5f-b014-6b5ca3e34c19-config-volume\") pod \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " Dec 01 20:45:02 crc kubenswrapper[4802]: I1201 20:45:02.955698 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfjzx\" (UniqueName: \"kubernetes.io/projected/771813aa-da4b-4a5f-b014-6b5ca3e34c19-kube-api-access-wfjzx\") pod \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\" (UID: \"771813aa-da4b-4a5f-b014-6b5ca3e34c19\") " Dec 01 20:45:02 crc kubenswrapper[4802]: I1201 20:45:02.956397 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/771813aa-da4b-4a5f-b014-6b5ca3e34c19-config-volume" (OuterVolumeSpecName: "config-volume") pod "771813aa-da4b-4a5f-b014-6b5ca3e34c19" (UID: "771813aa-da4b-4a5f-b014-6b5ca3e34c19"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:45:02 crc kubenswrapper[4802]: I1201 20:45:02.960996 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/771813aa-da4b-4a5f-b014-6b5ca3e34c19-kube-api-access-wfjzx" (OuterVolumeSpecName: "kube-api-access-wfjzx") pod "771813aa-da4b-4a5f-b014-6b5ca3e34c19" (UID: "771813aa-da4b-4a5f-b014-6b5ca3e34c19"). 
InnerVolumeSpecName "kube-api-access-wfjzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:45:02 crc kubenswrapper[4802]: I1201 20:45:02.961417 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/771813aa-da4b-4a5f-b014-6b5ca3e34c19-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "771813aa-da4b-4a5f-b014-6b5ca3e34c19" (UID: "771813aa-da4b-4a5f-b014-6b5ca3e34c19"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:45:03 crc kubenswrapper[4802]: I1201 20:45:03.057898 4802 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/771813aa-da4b-4a5f-b014-6b5ca3e34c19-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 20:45:03 crc kubenswrapper[4802]: I1201 20:45:03.057947 4802 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/771813aa-da4b-4a5f-b014-6b5ca3e34c19-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 20:45:03 crc kubenswrapper[4802]: I1201 20:45:03.057960 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfjzx\" (UniqueName: \"kubernetes.io/projected/771813aa-da4b-4a5f-b014-6b5ca3e34c19-kube-api-access-wfjzx\") on node \"crc\" DevicePath \"\"" Dec 01 20:45:03 crc kubenswrapper[4802]: I1201 20:45:03.634993 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" event={"ID":"771813aa-da4b-4a5f-b014-6b5ca3e34c19","Type":"ContainerDied","Data":"dacb82c6273ecbc66c8e6cfe236733548ecd1e0d9bf95df13b171a1e2c5811dc"} Dec 01 20:45:03 crc kubenswrapper[4802]: I1201 20:45:03.635028 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dacb82c6273ecbc66c8e6cfe236733548ecd1e0d9bf95df13b171a1e2c5811dc" Dec 01 20:45:03 crc kubenswrapper[4802]: I1201 20:45:03.635054 4802 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410365-px96f" Dec 01 20:45:04 crc kubenswrapper[4802]: I1201 20:45:03.999961 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb"] Dec 01 20:45:04 crc kubenswrapper[4802]: I1201 20:45:04.008951 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410320-ptlrb"] Dec 01 20:45:04 crc kubenswrapper[4802]: I1201 20:45:04.730092 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c376edc-1f4d-4651-ad0c-cdb7b1412a6c" path="/var/lib/kubelet/pods/7c376edc-1f4d-4651-ad0c-cdb7b1412a6c/volumes" Dec 01 20:45:12 crc kubenswrapper[4802]: I1201 20:45:12.720146 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:45:12 crc kubenswrapper[4802]: E1201 20:45:12.722056 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:45:23 crc kubenswrapper[4802]: I1201 20:45:23.720967 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:45:23 crc kubenswrapper[4802]: E1201 20:45:23.721785 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:45:26 crc kubenswrapper[4802]: I1201 20:45:26.894620 4802 scope.go:117] "RemoveContainer" containerID="5ccf237f66eb18a6d81651e774b4abc173c2f7cefc2728ef2947d6a60d97b7ca" Dec 01 20:45:36 crc kubenswrapper[4802]: I1201 20:45:36.720712 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:45:36 crc kubenswrapper[4802]: E1201 20:45:36.721546 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:45:50 crc kubenswrapper[4802]: I1201 20:45:50.721109 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:45:50 crc kubenswrapper[4802]: E1201 20:45:50.722122 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:45:59 crc kubenswrapper[4802]: I1201 20:45:59.093993 4802 generic.go:334] "Generic (PLEG): container finished" podID="bad07c68-596f-44ca-9580-335176bd8049" containerID="b432d8810475e34225b9269ed0a642d90b28dd8c601b51cf0c4d6e07cd994535" exitCode=0 Dec 01 20:45:59 crc kubenswrapper[4802]: I1201 20:45:59.095808 4802 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" event={"ID":"bad07c68-596f-44ca-9580-335176bd8049","Type":"ContainerDied","Data":"b432d8810475e34225b9269ed0a642d90b28dd8c601b51cf0c4d6e07cd994535"} Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.529668 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.648166 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-combined-ca-bundle\") pod \"bad07c68-596f-44ca-9580-335176bd8049\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.648237 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcvv6\" (UniqueName: \"kubernetes.io/projected/bad07c68-596f-44ca-9580-335176bd8049-kube-api-access-rcvv6\") pod \"bad07c68-596f-44ca-9580-335176bd8049\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.648285 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-inventory\") pod \"bad07c68-596f-44ca-9580-335176bd8049\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.648482 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ssh-key\") pod \"bad07c68-596f-44ca-9580-335176bd8049\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.648505 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-secret-0\") pod \"bad07c68-596f-44ca-9580-335176bd8049\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.648529 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ceph\") pod \"bad07c68-596f-44ca-9580-335176bd8049\" (UID: \"bad07c68-596f-44ca-9580-335176bd8049\") " Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.655413 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad07c68-596f-44ca-9580-335176bd8049-kube-api-access-rcvv6" (OuterVolumeSpecName: "kube-api-access-rcvv6") pod "bad07c68-596f-44ca-9580-335176bd8049" (UID: "bad07c68-596f-44ca-9580-335176bd8049"). InnerVolumeSpecName "kube-api-access-rcvv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.657291 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ceph" (OuterVolumeSpecName: "ceph") pod "bad07c68-596f-44ca-9580-335176bd8049" (UID: "bad07c68-596f-44ca-9580-335176bd8049"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.657574 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bad07c68-596f-44ca-9580-335176bd8049" (UID: "bad07c68-596f-44ca-9580-335176bd8049"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.681914 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bad07c68-596f-44ca-9580-335176bd8049" (UID: "bad07c68-596f-44ca-9580-335176bd8049"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.688383 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "bad07c68-596f-44ca-9580-335176bd8049" (UID: "bad07c68-596f-44ca-9580-335176bd8049"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.695432 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-inventory" (OuterVolumeSpecName: "inventory") pod "bad07c68-596f-44ca-9580-335176bd8049" (UID: "bad07c68-596f-44ca-9580-335176bd8049"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.751045 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.751388 4802 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.751402 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.751412 4802 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.751423 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcvv6\" (UniqueName: \"kubernetes.io/projected/bad07c68-596f-44ca-9580-335176bd8049-kube-api-access-rcvv6\") on node \"crc\" DevicePath \"\"" Dec 01 20:46:00 crc kubenswrapper[4802]: I1201 20:46:00.751434 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bad07c68-596f-44ca-9580-335176bd8049-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.111121 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" event={"ID":"bad07c68-596f-44ca-9580-335176bd8049","Type":"ContainerDied","Data":"5a2e4b55be795781ec1ad5b73cec87a3af12d5cbc101d0fb91ebbd9cbe253289"} Dec 01 20:46:01 crc 
kubenswrapper[4802]: I1201 20:46:01.111168 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a2e4b55be795781ec1ad5b73cec87a3af12d5cbc101d0fb91ebbd9cbe253289" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.111171 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.210242 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z"] Dec 01 20:46:01 crc kubenswrapper[4802]: E1201 20:46:01.210841 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad07c68-596f-44ca-9580-335176bd8049" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.210866 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad07c68-596f-44ca-9580-335176bd8049" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 20:46:01 crc kubenswrapper[4802]: E1201 20:46:01.210905 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771813aa-da4b-4a5f-b014-6b5ca3e34c19" containerName="collect-profiles" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.210912 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="771813aa-da4b-4a5f-b014-6b5ca3e34c19" containerName="collect-profiles" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.211079 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad07c68-596f-44ca-9580-335176bd8049" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.211095 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="771813aa-da4b-4a5f-b014-6b5ca3e34c19" containerName="collect-profiles" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.213030 4802 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.215593 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.215754 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.215836 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.215788 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.216022 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.216241 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.216307 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wx8v9" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.216472 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.217040 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263053 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-1\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263227 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263275 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263346 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263529 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6bdn\" (UniqueName: \"kubernetes.io/projected/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-kube-api-access-x6bdn\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263583 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263643 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263693 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263797 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263852 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.263917 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.269162 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z"] Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.365288 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.365330 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: 
\"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.365366 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.365413 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.366160 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.366208 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc 
kubenswrapper[4802]: I1201 20:46:01.366241 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.366291 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.366311 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6bdn\" (UniqueName: \"kubernetes.io/projected/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-kube-api-access-x6bdn\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.366378 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.366405 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: 
\"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.367573 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.367601 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.369213 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.369726 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: 
\"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.370255 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.370915 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.371458 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.372096 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.373002 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.375782 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.384372 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6bdn\" (UniqueName: \"kubernetes.io/projected/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-kube-api-access-x6bdn\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:01 crc kubenswrapper[4802]: I1201 20:46:01.537188 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:46:02 crc kubenswrapper[4802]: I1201 20:46:02.041458 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z"] Dec 01 20:46:02 crc kubenswrapper[4802]: W1201 20:46:02.047254 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e27448f_4d4a_4a61_b5c7_46fb0a3aa2a4.slice/crio-9307fdf4f57e0892e856c9708827a90f9df0b9028a32d25a1e9c6da230ef6027 WatchSource:0}: Error finding container 9307fdf4f57e0892e856c9708827a90f9df0b9028a32d25a1e9c6da230ef6027: Status 404 returned error can't find the container with id 9307fdf4f57e0892e856c9708827a90f9df0b9028a32d25a1e9c6da230ef6027 Dec 01 20:46:02 crc kubenswrapper[4802]: I1201 20:46:02.124991 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" event={"ID":"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4","Type":"ContainerStarted","Data":"9307fdf4f57e0892e856c9708827a90f9df0b9028a32d25a1e9c6da230ef6027"} Dec 01 20:46:02 crc kubenswrapper[4802]: I1201 20:46:02.721125 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:46:02 crc kubenswrapper[4802]: E1201 20:46:02.722425 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:46:04 crc kubenswrapper[4802]: I1201 20:46:04.141852 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" event={"ID":"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4","Type":"ContainerStarted","Data":"366a9220a99f24b8608226edf7d680656210cb2429b84099bc0e71db4bc1f825"} Dec 01 20:46:04 crc kubenswrapper[4802]: I1201 20:46:04.172966 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" podStartSLOduration=2.244702337 podStartE2EDuration="3.172945075s" podCreationTimestamp="2025-12-01 20:46:01 +0000 UTC" firstStartedPulling="2025-12-01 20:46:02.050171247 +0000 UTC m=+2983.612730908" lastFinishedPulling="2025-12-01 20:46:02.978413995 +0000 UTC m=+2984.540973646" observedRunningTime="2025-12-01 20:46:04.160135934 +0000 UTC m=+2985.722695575" watchObservedRunningTime="2025-12-01 20:46:04.172945075 +0000 UTC m=+2985.735504726" Dec 01 20:46:17 crc kubenswrapper[4802]: I1201 20:46:17.720761 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:46:17 crc kubenswrapper[4802]: E1201 20:46:17.721662 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:46:32 crc kubenswrapper[4802]: I1201 20:46:32.720620 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:46:32 crc kubenswrapper[4802]: E1201 20:46:32.721572 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:46:45 crc kubenswrapper[4802]: I1201 20:46:45.719731 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:46:45 crc kubenswrapper[4802]: E1201 20:46:45.720715 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:46:56 crc kubenswrapper[4802]: I1201 20:46:56.720551 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:46:56 crc kubenswrapper[4802]: E1201 20:46:56.721275 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:47:08 crc kubenswrapper[4802]: I1201 20:47:08.729060 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:47:08 crc kubenswrapper[4802]: E1201 20:47:08.729989 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:47:19 crc kubenswrapper[4802]: I1201 20:47:19.720977 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:47:19 crc kubenswrapper[4802]: E1201 20:47:19.721845 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:47:33 crc kubenswrapper[4802]: I1201 20:47:33.720543 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:47:33 crc kubenswrapper[4802]: E1201 20:47:33.721232 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:47:47 crc kubenswrapper[4802]: I1201 20:47:47.719994 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:47:47 crc kubenswrapper[4802]: E1201 20:47:47.720796 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:47:59 crc kubenswrapper[4802]: I1201 20:47:59.720760 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:47:59 crc kubenswrapper[4802]: E1201 20:47:59.721573 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:48:12 crc kubenswrapper[4802]: I1201 20:48:12.981838 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c2jsc"] Dec 01 20:48:12 crc kubenswrapper[4802]: I1201 20:48:12.985353 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:12 crc kubenswrapper[4802]: I1201 20:48:12.998342 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c2jsc"] Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.063808 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-catalog-content\") pod \"community-operators-c2jsc\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.063965 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckd7c\" (UniqueName: \"kubernetes.io/projected/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-kube-api-access-ckd7c\") pod \"community-operators-c2jsc\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.064033 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-utilities\") pod \"community-operators-c2jsc\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.166234 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-catalog-content\") pod \"community-operators-c2jsc\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.166322 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ckd7c\" (UniqueName: \"kubernetes.io/projected/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-kube-api-access-ckd7c\") pod \"community-operators-c2jsc\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.166369 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-utilities\") pod \"community-operators-c2jsc\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.166898 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-utilities\") pod \"community-operators-c2jsc\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.167061 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-catalog-content\") pod \"community-operators-c2jsc\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.190789 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckd7c\" (UniqueName: \"kubernetes.io/projected/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-kube-api-access-ckd7c\") pod \"community-operators-c2jsc\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.311799 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.719905 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:48:13 crc kubenswrapper[4802]: E1201 20:48:13.720495 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:48:13 crc kubenswrapper[4802]: I1201 20:48:13.809987 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c2jsc"] Dec 01 20:48:14 crc kubenswrapper[4802]: I1201 20:48:14.380893 4802 generic.go:334] "Generic (PLEG): container finished" podID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerID="5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8" exitCode=0 Dec 01 20:48:14 crc kubenswrapper[4802]: I1201 20:48:14.381177 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2jsc" event={"ID":"3e7d5371-a3c5-4253-b8ed-a46478bc00f0","Type":"ContainerDied","Data":"5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8"} Dec 01 20:48:14 crc kubenswrapper[4802]: I1201 20:48:14.381228 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2jsc" event={"ID":"3e7d5371-a3c5-4253-b8ed-a46478bc00f0","Type":"ContainerStarted","Data":"1d63f7896ecc009e7a6849b201e72aee478ab189884647b9e8d9795d592d768e"} Dec 01 20:48:14 crc kubenswrapper[4802]: I1201 20:48:14.384548 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 
20:48:15 crc kubenswrapper[4802]: I1201 20:48:15.391749 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2jsc" event={"ID":"3e7d5371-a3c5-4253-b8ed-a46478bc00f0","Type":"ContainerStarted","Data":"d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401"} Dec 01 20:48:16 crc kubenswrapper[4802]: I1201 20:48:16.403246 4802 generic.go:334] "Generic (PLEG): container finished" podID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerID="d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401" exitCode=0 Dec 01 20:48:16 crc kubenswrapper[4802]: I1201 20:48:16.403303 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2jsc" event={"ID":"3e7d5371-a3c5-4253-b8ed-a46478bc00f0","Type":"ContainerDied","Data":"d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401"} Dec 01 20:48:17 crc kubenswrapper[4802]: I1201 20:48:17.416503 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2jsc" event={"ID":"3e7d5371-a3c5-4253-b8ed-a46478bc00f0","Type":"ContainerStarted","Data":"f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938"} Dec 01 20:48:17 crc kubenswrapper[4802]: I1201 20:48:17.440302 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c2jsc" podStartSLOduration=2.993701872 podStartE2EDuration="5.440284874s" podCreationTimestamp="2025-12-01 20:48:12 +0000 UTC" firstStartedPulling="2025-12-01 20:48:14.384364463 +0000 UTC m=+3115.946924104" lastFinishedPulling="2025-12-01 20:48:16.830947465 +0000 UTC m=+3118.393507106" observedRunningTime="2025-12-01 20:48:17.43309575 +0000 UTC m=+3118.995655391" watchObservedRunningTime="2025-12-01 20:48:17.440284874 +0000 UTC m=+3119.002844515" Dec 01 20:48:23 crc kubenswrapper[4802]: I1201 20:48:23.312632 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:23 crc kubenswrapper[4802]: I1201 20:48:23.313226 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:23 crc kubenswrapper[4802]: I1201 20:48:23.384971 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:23 crc kubenswrapper[4802]: I1201 20:48:23.507226 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:23 crc kubenswrapper[4802]: I1201 20:48:23.617113 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c2jsc"] Dec 01 20:48:25 crc kubenswrapper[4802]: I1201 20:48:25.480782 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c2jsc" podUID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerName="registry-server" containerID="cri-o://f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938" gracePeriod=2 Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.166798 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.308651 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckd7c\" (UniqueName: \"kubernetes.io/projected/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-kube-api-access-ckd7c\") pod \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.308805 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-catalog-content\") pod \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.308888 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-utilities\") pod \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\" (UID: \"3e7d5371-a3c5-4253-b8ed-a46478bc00f0\") " Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.309667 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-utilities" (OuterVolumeSpecName: "utilities") pod "3e7d5371-a3c5-4253-b8ed-a46478bc00f0" (UID: "3e7d5371-a3c5-4253-b8ed-a46478bc00f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.314673 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-kube-api-access-ckd7c" (OuterVolumeSpecName: "kube-api-access-ckd7c") pod "3e7d5371-a3c5-4253-b8ed-a46478bc00f0" (UID: "3e7d5371-a3c5-4253-b8ed-a46478bc00f0"). InnerVolumeSpecName "kube-api-access-ckd7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.364009 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e7d5371-a3c5-4253-b8ed-a46478bc00f0" (UID: "3e7d5371-a3c5-4253-b8ed-a46478bc00f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.411597 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.411664 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.411676 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckd7c\" (UniqueName: \"kubernetes.io/projected/3e7d5371-a3c5-4253-b8ed-a46478bc00f0-kube-api-access-ckd7c\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.492143 4802 generic.go:334] "Generic (PLEG): container finished" podID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerID="f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938" exitCode=0 Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.492184 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2jsc" event={"ID":"3e7d5371-a3c5-4253-b8ed-a46478bc00f0","Type":"ContainerDied","Data":"f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938"} Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.492225 4802 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-c2jsc" event={"ID":"3e7d5371-a3c5-4253-b8ed-a46478bc00f0","Type":"ContainerDied","Data":"1d63f7896ecc009e7a6849b201e72aee478ab189884647b9e8d9795d592d768e"} Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.492241 4802 scope.go:117] "RemoveContainer" containerID="f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.492364 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c2jsc" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.526262 4802 scope.go:117] "RemoveContainer" containerID="d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.528601 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c2jsc"] Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.538778 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c2jsc"] Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.549025 4802 scope.go:117] "RemoveContainer" containerID="5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.584722 4802 scope.go:117] "RemoveContainer" containerID="f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938" Dec 01 20:48:26 crc kubenswrapper[4802]: E1201 20:48:26.585174 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938\": container with ID starting with f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938 not found: ID does not exist" containerID="f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 
20:48:26.585245 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938"} err="failed to get container status \"f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938\": rpc error: code = NotFound desc = could not find container \"f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938\": container with ID starting with f38f09f0f4f620bb7325c00580166cafbc0d0cb1c96f718affc3b5b328b7f938 not found: ID does not exist" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.585271 4802 scope.go:117] "RemoveContainer" containerID="d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401" Dec 01 20:48:26 crc kubenswrapper[4802]: E1201 20:48:26.585792 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401\": container with ID starting with d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401 not found: ID does not exist" containerID="d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.585825 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401"} err="failed to get container status \"d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401\": rpc error: code = NotFound desc = could not find container \"d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401\": container with ID starting with d66f4c3e89c106d5420f50688218a82534d454260d6df4e94989e6791f2cd401 not found: ID does not exist" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.585855 4802 scope.go:117] "RemoveContainer" containerID="5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8" Dec 01 20:48:26 crc 
kubenswrapper[4802]: E1201 20:48:26.586210 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8\": container with ID starting with 5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8 not found: ID does not exist" containerID="5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.586271 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8"} err="failed to get container status \"5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8\": rpc error: code = NotFound desc = could not find container \"5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8\": container with ID starting with 5d3438ee3d6e12264658fbbe9edc7346aa67e02d154a0a35966675ef73c06cc8 not found: ID does not exist" Dec 01 20:48:26 crc kubenswrapper[4802]: I1201 20:48:26.730312 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" path="/var/lib/kubelet/pods/3e7d5371-a3c5-4253-b8ed-a46478bc00f0/volumes" Dec 01 20:48:28 crc kubenswrapper[4802]: I1201 20:48:28.728484 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:48:28 crc kubenswrapper[4802]: E1201 20:48:28.729143 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:48:41 crc 
kubenswrapper[4802]: I1201 20:48:41.720070 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:48:41 crc kubenswrapper[4802]: E1201 20:48:41.720785 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:48:51 crc kubenswrapper[4802]: I1201 20:48:51.685993 4802 generic.go:334] "Generic (PLEG): container finished" podID="5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" containerID="366a9220a99f24b8608226edf7d680656210cb2429b84099bc0e71db4bc1f825" exitCode=0 Dec 01 20:48:51 crc kubenswrapper[4802]: I1201 20:48:51.686142 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" event={"ID":"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4","Type":"ContainerDied","Data":"366a9220a99f24b8608226edf7d680656210cb2429b84099bc0e71db4bc1f825"} Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.166998 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328147 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-extra-config-0\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328265 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ssh-key\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328289 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328331 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-inventory\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328378 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-1\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328399 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph-nova-0\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328440 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-custom-ceph-combined-ca-bundle\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328458 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-1\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328516 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-0\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328532 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6bdn\" (UniqueName: \"kubernetes.io/projected/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-kube-api-access-x6bdn\") pod \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.328567 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-0\") pod 
\"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\" (UID: \"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4\") " Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.345710 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph" (OuterVolumeSpecName: "ceph") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.345738 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-kube-api-access-x6bdn" (OuterVolumeSpecName: "kube-api-access-x6bdn") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "kube-api-access-x6bdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.345781 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.359917 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.363812 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.369350 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.371853 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.375172 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.376900 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-inventory" (OuterVolumeSpecName: "inventory") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.383728 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.391979 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" (UID: "5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.430979 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.431020 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.431034 4802 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.431047 4802 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.431062 4802 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.431075 4802 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.431087 4802 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 
20:48:53.431098 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6bdn\" (UniqueName: \"kubernetes.io/projected/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-kube-api-access-x6bdn\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.431135 4802 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.431147 4802 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.431157 4802 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.738719 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" event={"ID":"5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4","Type":"ContainerDied","Data":"9307fdf4f57e0892e856c9708827a90f9df0b9028a32d25a1e9c6da230ef6027"} Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.738988 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9307fdf4f57e0892e856c9708827a90f9df0b9028a32d25a1e9c6da230ef6027" Dec 01 20:48:53 crc kubenswrapper[4802]: I1201 20:48:53.738803 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z" Dec 01 20:48:56 crc kubenswrapper[4802]: I1201 20:48:56.720273 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:48:56 crc kubenswrapper[4802]: E1201 20:48:56.720803 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.022214 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 01 20:49:08 crc kubenswrapper[4802]: E1201 20:49:08.023188 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerName="extract-utilities" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.023226 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerName="extract-utilities" Dec 01 20:49:08 crc kubenswrapper[4802]: E1201 20:49:08.023238 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerName="extract-content" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.023246 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerName="extract-content" Dec 01 20:49:08 crc kubenswrapper[4802]: E1201 20:49:08.023255 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerName="registry-server" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.023263 4802 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerName="registry-server" Dec 01 20:49:08 crc kubenswrapper[4802]: E1201 20:49:08.023297 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.023307 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.023528 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.023548 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7d5371-a3c5-4253-b8ed-a46478bc00f0" containerName="registry-server" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.024727 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.031383 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.031659 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.032927 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.034491 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.038073 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.046733 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.054106 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.090753 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.090811 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.090860 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.090906 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-var-locks-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.090925 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.090944 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091006 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091024 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kdzw\" (UniqueName: \"kubernetes.io/projected/83603474-dc08-4ea8-a158-cba205dab6da-kube-api-access-7kdzw\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091152 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-etc-iscsi\") pod 
\"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091251 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-dev\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091294 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-sys\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091409 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091437 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091474 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: 
\"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091550 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/83603474-dc08-4ea8-a158-cba205dab6da-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.091653 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-run\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.193629 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-config-data\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.193673 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-lib-modules\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.193721 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: 
I1201 20:49:08.193773 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.193807 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-dev\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.193868 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-sys\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.193895 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-sys\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.193917 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-dev\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.193962 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-sys\") pod 
\"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.193980 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-dev\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194043 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194064 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194111 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194123 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 
crc kubenswrapper[4802]: I1201 20:49:08.194161 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194177 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/83603474-dc08-4ea8-a158-cba205dab6da-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194231 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-run\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194265 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194290 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194314 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194341 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194361 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194389 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-run\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194407 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194414 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-etc-iscsi\") pod \"cinder-backup-0\" (UID: 
\"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.194443 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.195088 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-run\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.195349 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.195765 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.195800 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc 
kubenswrapper[4802]: I1201 20:49:08.195832 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.195852 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.195884 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn5ws\" (UniqueName: \"kubernetes.io/projected/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-kube-api-access-nn5ws\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.195928 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.195971 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-scripts\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.195998 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.196051 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.196085 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kdzw\" (UniqueName: \"kubernetes.io/projected/83603474-dc08-4ea8-a158-cba205dab6da-kube-api-access-7kdzw\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.196111 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-ceph\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.196341 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.199851 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/83603474-dc08-4ea8-a158-cba205dab6da-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.200021 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.200051 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.200152 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.200178 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/83603474-dc08-4ea8-a158-cba205dab6da-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.200536 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " 
pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.216999 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kdzw\" (UniqueName: \"kubernetes.io/projected/83603474-dc08-4ea8-a158-cba205dab6da-kube-api-access-7kdzw\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.218208 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83603474-dc08-4ea8-a158-cba205dab6da-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"83603474-dc08-4ea8-a158-cba205dab6da\") " pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298217 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298510 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298538 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-run\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298556 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298575 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298596 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298611 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298628 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298650 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn5ws\" (UniqueName: \"kubernetes.io/projected/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-kube-api-access-nn5ws\") pod 
\"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298673 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-scripts\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298693 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298724 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-ceph\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298743 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-config-data\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298759 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-lib-modules\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298791 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-sys\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298821 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-dev\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298906 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-dev\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298942 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-run\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298963 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298984 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.299022 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.299054 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.298389 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.299479 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-lib-modules\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.301378 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-sys\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.301723 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-etc-nvme\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" 
Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.303235 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-config-data\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.305737 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-scripts\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.307163 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.309959 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-config-data-custom\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.330794 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-ceph\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.337144 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn5ws\" (UniqueName: 
\"kubernetes.io/projected/cb0c455b-a5d4-41cf-87c3-673a3deac7cb-kube-api-access-nn5ws\") pod \"cinder-backup-0\" (UID: \"cb0c455b-a5d4-41cf-87c3-673a3deac7cb\") " pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.352550 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.372045 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.625978 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-mkqmt"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.631659 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mkqmt" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.645256 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-mkqmt"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.655658 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-3642-account-create-update-p2zgg"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.657020 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3642-account-create-update-p2zgg" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.659506 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.677045 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3642-account-create-update-p2zgg"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.727087 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a66b5e77-5e95-4fd0-957c-9d57f62a2238-operator-scripts\") pod \"manila-db-create-mkqmt\" (UID: \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\") " pod="openstack/manila-db-create-mkqmt" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.727164 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76djn\" (UniqueName: \"kubernetes.io/projected/a66b5e77-5e95-4fd0-957c-9d57f62a2238-kube-api-access-76djn\") pod \"manila-db-create-mkqmt\" (UID: \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\") " pod="openstack/manila-db-create-mkqmt" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.763367 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6468fb5467-v7gmh"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.764792 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6468fb5467-v7gmh"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.764993 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.774638 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.776502 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.778098 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.778309 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-xsd82" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.778492 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.778535 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.783892 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.784188 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.784529 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kflk9" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.784565 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.793989 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.815260 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.816802 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.822748 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.822952 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.824840 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-769b6f4d57-mlfdl"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.831427 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf7bq\" (UniqueName: \"kubernetes.io/projected/f597ab05-4236-4a1c-95cb-3ce637a2dd52-kube-api-access-zf7bq\") pod \"manila-3642-account-create-update-p2zgg\" (UID: \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\") " pod="openstack/manila-3642-account-create-update-p2zgg" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.831519 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a66b5e77-5e95-4fd0-957c-9d57f62a2238-operator-scripts\") pod \"manila-db-create-mkqmt\" (UID: \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\") " pod="openstack/manila-db-create-mkqmt" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.831591 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76djn\" (UniqueName: \"kubernetes.io/projected/a66b5e77-5e95-4fd0-957c-9d57f62a2238-kube-api-access-76djn\") pod \"manila-db-create-mkqmt\" (UID: \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\") " pod="openstack/manila-db-create-mkqmt" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.831622 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f597ab05-4236-4a1c-95cb-3ce637a2dd52-operator-scripts\") pod \"manila-3642-account-create-update-p2zgg\" (UID: \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\") " pod="openstack/manila-3642-account-create-update-p2zgg" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.833823 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a66b5e77-5e95-4fd0-957c-9d57f62a2238-operator-scripts\") pod \"manila-db-create-mkqmt\" (UID: \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\") " pod="openstack/manila-db-create-mkqmt" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.835639 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-769b6f4d57-mlfdl"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.835841 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.850381 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.881730 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76djn\" (UniqueName: \"kubernetes.io/projected/a66b5e77-5e95-4fd0-957c-9d57f62a2238-kube-api-access-76djn\") pod \"manila-db-create-mkqmt\" (UID: \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\") " pod="openstack/manila-db-create-mkqmt" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942380 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-logs\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942431 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942462 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-scripts\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942513 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c071511c-6a08-4631-b449-92a8a12d69f4-horizon-secret-key\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942536 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-config-data\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942555 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942575 4802 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq2qf\" (UniqueName: \"kubernetes.io/projected/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-kube-api-access-qq2qf\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942598 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942639 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-horizon-secret-key\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942655 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942681 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc 
kubenswrapper[4802]: I1201 20:49:08.942702 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942723 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942745 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-scripts\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942772 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f597ab05-4236-4a1c-95cb-3ce637a2dd52-operator-scripts\") pod \"manila-3642-account-create-update-p2zgg\" (UID: \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\") " pod="openstack/manila-3642-account-create-update-p2zgg" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942808 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-scripts\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc 
kubenswrapper[4802]: I1201 20:49:08.942833 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-config-data\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942853 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-logs\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942882 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942920 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-ceph\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942951 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-config-data\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942971 
4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c071511c-6a08-4631-b449-92a8a12d69f4-logs\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.942988 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqkd\" (UniqueName: \"kubernetes.io/projected/c071511c-6a08-4631-b449-92a8a12d69f4-kube-api-access-xxqkd\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.943010 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.943039 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf7bq\" (UniqueName: \"kubernetes.io/projected/f597ab05-4236-4a1c-95cb-3ce637a2dd52-kube-api-access-zf7bq\") pod \"manila-3642-account-create-update-p2zgg\" (UID: \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\") " pod="openstack/manila-3642-account-create-update-p2zgg" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.943093 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.943136 
4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqqt8\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-kube-api-access-gqqt8\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.943151 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.943770 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnrb8\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-kube-api-access-jnrb8\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.943803 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.948300 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f597ab05-4236-4a1c-95cb-3ce637a2dd52-operator-scripts\") pod \"manila-3642-account-create-update-p2zgg\" (UID: \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\") " 
pod="openstack/manila-3642-account-create-update-p2zgg" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.979289 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mkqmt" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.980499 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf7bq\" (UniqueName: \"kubernetes.io/projected/f597ab05-4236-4a1c-95cb-3ce637a2dd52-kube-api-access-zf7bq\") pod \"manila-3642-account-create-update-p2zgg\" (UID: \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\") " pod="openstack/manila-3642-account-create-update-p2zgg" Dec 01 20:49:08 crc kubenswrapper[4802]: I1201 20:49:08.994631 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3642-account-create-update-p2zgg" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.045863 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c071511c-6a08-4631-b449-92a8a12d69f4-horizon-secret-key\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.045920 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-config-data\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.045951 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 
01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.045978 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq2qf\" (UniqueName: \"kubernetes.io/projected/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-kube-api-access-qq2qf\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046009 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046051 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-horizon-secret-key\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046073 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046097 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 
20:49:09.046114 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046131 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046150 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-scripts\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046183 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-scripts\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046216 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-config-data\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046232 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-logs\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046257 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046285 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-ceph\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046313 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-config-data\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046338 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c071511c-6a08-4631-b449-92a8a12d69f4-logs\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046359 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqkd\" (UniqueName: \"kubernetes.io/projected/c071511c-6a08-4631-b449-92a8a12d69f4-kube-api-access-xxqkd\") pod \"horizon-6468fb5467-v7gmh\" (UID: 
\"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046381 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046402 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046428 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046444 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnrb8\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-kube-api-access-jnrb8\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046467 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqqt8\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-kube-api-access-gqqt8\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " 
pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046485 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046518 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-logs\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046568 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.046588 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-scripts\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.047667 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: 
I1201 20:49:09.048465 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c071511c-6a08-4631-b449-92a8a12d69f4-logs\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.049372 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-scripts\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.049977 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.050238 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.050259 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-config-data\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.050373 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.050559 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-logs\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.050507 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.050767 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-scripts\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.055374 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.055990 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-logs\") pod \"horizon-769b6f4d57-mlfdl\" (UID: 
\"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.056288 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.061223 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-ceph\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.061682 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-ceph\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.061769 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c071511c-6a08-4631-b449-92a8a12d69f4-horizon-secret-key\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.062630 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 
20:49:09.062752 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.062914 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-scripts\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.063003 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-horizon-secret-key\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.064976 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-config-data\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.065014 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.066050 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.074458 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq2qf\" (UniqueName: \"kubernetes.io/projected/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-kube-api-access-qq2qf\") pod \"horizon-769b6f4d57-mlfdl\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.075431 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-config-data\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.084405 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnrb8\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-kube-api-access-jnrb8\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.091111 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqqt8\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-kube-api-access-gqqt8\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.092261 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqkd\" (UniqueName: 
\"kubernetes.io/projected/c071511c-6a08-4631-b449-92a8a12d69f4-kube-api-access-xxqkd\") pod \"horizon-6468fb5467-v7gmh\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.125153 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.131362 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.131845 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") " pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.152575 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.187893 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.201096 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.203148 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.622242 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-mkqmt"] Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.662848 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3642-account-create-update-p2zgg"] Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.720053 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:49:09 crc kubenswrapper[4802]: E1201 20:49:09.720340 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.747986 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6468fb5467-v7gmh"] Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.836710 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-769b6f4d57-mlfdl"] Dec 01 20:49:09 crc kubenswrapper[4802]: W1201 20:49:09.846834 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc071511c_6a08_4631_b449_92a8a12d69f4.slice/crio-f0f30a383466134d7929b234e3c0de8c53db85ccee6a12099b3db3e603a63f73 WatchSource:0}: Error finding container f0f30a383466134d7929b234e3c0de8c53db85ccee6a12099b3db3e603a63f73: Status 404 returned error can't find the 
container with id f0f30a383466134d7929b234e3c0de8c53db85ccee6a12099b3db3e603a63f73 Dec 01 20:49:09 crc kubenswrapper[4802]: W1201 20:49:09.863659 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fdc24d0_b560_47a5_ad7d_d77d7581f7d9.slice/crio-1d75c112d515826420f4abe6d5deb3ce3ba43b2c24e157c2a7287fd547f819e4 WatchSource:0}: Error finding container 1d75c112d515826420f4abe6d5deb3ce3ba43b2c24e157c2a7287fd547f819e4: Status 404 returned error can't find the container with id 1d75c112d515826420f4abe6d5deb3ce3ba43b2c24e157c2a7287fd547f819e4 Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.879516 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.891610 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6468fb5467-v7gmh" event={"ID":"c071511c-6a08-4631-b449-92a8a12d69f4","Type":"ContainerStarted","Data":"f0f30a383466134d7929b234e3c0de8c53db85ccee6a12099b3db3e603a63f73"} Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.893388 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-mkqmt" event={"ID":"a66b5e77-5e95-4fd0-957c-9d57f62a2238","Type":"ContainerStarted","Data":"b2f3acd97461676808a653a3b050b665530451fec70dfcf833af3cbb45e36863"} Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.895014 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cb0c455b-a5d4-41cf-87c3-673a3deac7cb","Type":"ContainerStarted","Data":"417b0784eb1212e9ac74e2bb7e9cdb0b9654d97d31b66b35d1a0f6646e785109"} Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.905076 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3642-account-create-update-p2zgg" 
event={"ID":"f597ab05-4236-4a1c-95cb-3ce637a2dd52","Type":"ContainerStarted","Data":"cafa61e512dde38f393f13c69cef5e2e255ce2ace18d572438bd41933082311d"} Dec 01 20:49:09 crc kubenswrapper[4802]: I1201 20:49:09.913447 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769b6f4d57-mlfdl" event={"ID":"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9","Type":"ContainerStarted","Data":"1d75c112d515826420f4abe6d5deb3ce3ba43b2c24e157c2a7287fd547f819e4"} Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.169642 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.329528 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 01 20:49:10 crc kubenswrapper[4802]: W1201 20:49:10.414050 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83603474_dc08_4ea8_a158_cba205dab6da.slice/crio-6118ffddf13499552df2b8a9323b04133cc2000a93cc6492823f9286d255ca0e WatchSource:0}: Error finding container 6118ffddf13499552df2b8a9323b04133cc2000a93cc6492823f9286d255ca0e: Status 404 returned error can't find the container with id 6118ffddf13499552df2b8a9323b04133cc2000a93cc6492823f9286d255ca0e Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.929480 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b76e652f-2feb-41c9-b8f6-16efc7080094","Type":"ContainerStarted","Data":"62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11"} Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.929790 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b76e652f-2feb-41c9-b8f6-16efc7080094","Type":"ContainerStarted","Data":"55c8da4ff5209bc5be588b7f5bba16de02edc5da5e8c972cad44f6d8875fb661"} Dec 01 20:49:10 crc 
kubenswrapper[4802]: I1201 20:49:10.932553 4802 generic.go:334] "Generic (PLEG): container finished" podID="a66b5e77-5e95-4fd0-957c-9d57f62a2238" containerID="70cf64eabf7d4d951d30023add8f6fcf45777ed5ddaccbe3a07ee94762c2e238" exitCode=0 Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.932665 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-mkqmt" event={"ID":"a66b5e77-5e95-4fd0-957c-9d57f62a2238","Type":"ContainerDied","Data":"70cf64eabf7d4d951d30023add8f6fcf45777ed5ddaccbe3a07ee94762c2e238"} Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.936022 4802 generic.go:334] "Generic (PLEG): container finished" podID="f597ab05-4236-4a1c-95cb-3ce637a2dd52" containerID="190ff2633167971da5c1baa380e299005a4c805ee6f539edfbe9697d88ec2773" exitCode=0 Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.936115 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3642-account-create-update-p2zgg" event={"ID":"f597ab05-4236-4a1c-95cb-3ce637a2dd52","Type":"ContainerDied","Data":"190ff2633167971da5c1baa380e299005a4c805ee6f539edfbe9697d88ec2773"} Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.939724 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51562a38-c06f-470d-9224-ea64998f81c6","Type":"ContainerStarted","Data":"38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f"} Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.939765 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51562a38-c06f-470d-9224-ea64998f81c6","Type":"ContainerStarted","Data":"b47b399a8bf2dc6514724a656b7c15a1e308f8b27e9cd06ce440e81c971da219"} Dec 01 20:49:10 crc kubenswrapper[4802]: I1201 20:49:10.940829 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"83603474-dc08-4ea8-a158-cba205dab6da","Type":"ContainerStarted","Data":"6118ffddf13499552df2b8a9323b04133cc2000a93cc6492823f9286d255ca0e"} Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.145415 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6468fb5467-v7gmh"] Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.173221 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.221839 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58f6ccd776-gr6kp"] Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.239638 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.250177 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.269878 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-config-data\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.270040 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a61354-70f8-4e95-ab30-6e1d90128879-logs\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.270111 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg6mq\" (UniqueName: 
\"kubernetes.io/projected/f2a61354-70f8-4e95-ab30-6e1d90128879-kube-api-access-lg6mq\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.270203 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-scripts\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.270262 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-tls-certs\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.270301 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-combined-ca-bundle\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.270486 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-secret-key\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.306672 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58f6ccd776-gr6kp"] Dec 01 20:49:11 crc 
kubenswrapper[4802]: I1201 20:49:11.323230 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-769b6f4d57-mlfdl"] Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.340593 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.359709 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b69f75cb8-xrkks"] Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.361797 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.378741 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-config-data\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.378779 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b69f75cb8-xrkks"] Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.378807 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a61354-70f8-4e95-ab30-6e1d90128879-logs\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.378857 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg6mq\" (UniqueName: \"kubernetes.io/projected/f2a61354-70f8-4e95-ab30-6e1d90128879-kube-api-access-lg6mq\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.378897 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-scripts\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.378925 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-tls-certs\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.378950 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-combined-ca-bundle\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.379012 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-secret-key\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.379623 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a61354-70f8-4e95-ab30-6e1d90128879-logs\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.382795 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-config-data\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.384381 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-scripts\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.386376 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-combined-ca-bundle\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.386545 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-secret-key\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.392623 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-tls-certs\") pod \"horizon-58f6ccd776-gr6kp\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.398206 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg6mq\" (UniqueName: \"kubernetes.io/projected/f2a61354-70f8-4e95-ab30-6e1d90128879-kube-api-access-lg6mq\") pod \"horizon-58f6ccd776-gr6kp\" (UID: 
\"f2a61354-70f8-4e95-ab30-6e1d90128879\") " pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.482651 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40185112-89e4-49c3-9ccc-0b190724c5ff-config-data\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.482749 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/40185112-89e4-49c3-9ccc-0b190724c5ff-horizon-tls-certs\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.482810 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngd4j\" (UniqueName: \"kubernetes.io/projected/40185112-89e4-49c3-9ccc-0b190724c5ff-kube-api-access-ngd4j\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.482834 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40185112-89e4-49c3-9ccc-0b190724c5ff-logs\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.482871 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40185112-89e4-49c3-9ccc-0b190724c5ff-horizon-secret-key\") pod \"horizon-b69f75cb8-xrkks\" (UID: 
\"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.482886 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40185112-89e4-49c3-9ccc-0b190724c5ff-combined-ca-bundle\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.482959 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40185112-89e4-49c3-9ccc-0b190724c5ff-scripts\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.584740 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngd4j\" (UniqueName: \"kubernetes.io/projected/40185112-89e4-49c3-9ccc-0b190724c5ff-kube-api-access-ngd4j\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.584778 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40185112-89e4-49c3-9ccc-0b190724c5ff-logs\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.584817 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40185112-89e4-49c3-9ccc-0b190724c5ff-horizon-secret-key\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " 
pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.584832 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40185112-89e4-49c3-9ccc-0b190724c5ff-combined-ca-bundle\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.584885 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40185112-89e4-49c3-9ccc-0b190724c5ff-scripts\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.584942 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40185112-89e4-49c3-9ccc-0b190724c5ff-config-data\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.584981 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/40185112-89e4-49c3-9ccc-0b190724c5ff-horizon-tls-certs\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.587010 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40185112-89e4-49c3-9ccc-0b190724c5ff-scripts\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.588075 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40185112-89e4-49c3-9ccc-0b190724c5ff-config-data\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.588911 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40185112-89e4-49c3-9ccc-0b190724c5ff-logs\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.597796 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40185112-89e4-49c3-9ccc-0b190724c5ff-horizon-secret-key\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.598048 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/40185112-89e4-49c3-9ccc-0b190724c5ff-horizon-tls-certs\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.599204 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40185112-89e4-49c3-9ccc-0b190724c5ff-combined-ca-bundle\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.607650 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngd4j\" (UniqueName: 
\"kubernetes.io/projected/40185112-89e4-49c3-9ccc-0b190724c5ff-kube-api-access-ngd4j\") pod \"horizon-b69f75cb8-xrkks\" (UID: \"40185112-89e4-49c3-9ccc-0b190724c5ff\") " pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.624108 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.637649 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.964465 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cb0c455b-a5d4-41cf-87c3-673a3deac7cb","Type":"ContainerStarted","Data":"0cd6279e47d47166cd460fb0a7fc13c832e9d9f32e6db2f467e5b779c2393290"} Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.969150 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51562a38-c06f-470d-9224-ea64998f81c6","Type":"ContainerStarted","Data":"5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73"} Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.969306 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="51562a38-c06f-470d-9224-ea64998f81c6" containerName="glance-log" containerID="cri-o://38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f" gracePeriod=30 Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.969364 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="51562a38-c06f-470d-9224-ea64998f81c6" containerName="glance-httpd" containerID="cri-o://5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73" gracePeriod=30 Dec 01 20:49:11 crc kubenswrapper[4802]: I1201 20:49:11.973669 4802 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"83603474-dc08-4ea8-a158-cba205dab6da","Type":"ContainerStarted","Data":"434215ac272c3e98d9bd87c25112e456dcb614d90b5e06ca790a0de107d37ca4"} Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.005262 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.005240625 podStartE2EDuration="4.005240625s" podCreationTimestamp="2025-12-01 20:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:49:12.001235745 +0000 UTC m=+3173.563795416" watchObservedRunningTime="2025-12-01 20:49:12.005240625 +0000 UTC m=+3173.567800306" Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.196441 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58f6ccd776-gr6kp"] Dec 01 20:49:12 crc kubenswrapper[4802]: W1201 20:49:12.224411 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2a61354_70f8_4e95_ab30_6e1d90128879.slice/crio-453c089c12f0c993eda2a9db6f02229a6c2f6ff40d03ad1dd2c9798f24ed434d WatchSource:0}: Error finding container 453c089c12f0c993eda2a9db6f02229a6c2f6ff40d03ad1dd2c9798f24ed434d: Status 404 returned error can't find the container with id 453c089c12f0c993eda2a9db6f02229a6c2f6ff40d03ad1dd2c9798f24ed434d Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.533749 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b69f75cb8-xrkks"] Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.725308 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-mkqmt" Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.738299 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a66b5e77-5e95-4fd0-957c-9d57f62a2238-operator-scripts\") pod \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\" (UID: \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\") " Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.738371 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76djn\" (UniqueName: \"kubernetes.io/projected/a66b5e77-5e95-4fd0-957c-9d57f62a2238-kube-api-access-76djn\") pod \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\" (UID: \"a66b5e77-5e95-4fd0-957c-9d57f62a2238\") " Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.743788 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a66b5e77-5e95-4fd0-957c-9d57f62a2238-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a66b5e77-5e95-4fd0-957c-9d57f62a2238" (UID: "a66b5e77-5e95-4fd0-957c-9d57f62a2238"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.770934 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3642-account-create-update-p2zgg" Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.792112 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a66b5e77-5e95-4fd0-957c-9d57f62a2238-kube-api-access-76djn" (OuterVolumeSpecName: "kube-api-access-76djn") pod "a66b5e77-5e95-4fd0-957c-9d57f62a2238" (UID: "a66b5e77-5e95-4fd0-957c-9d57f62a2238"). InnerVolumeSpecName "kube-api-access-76djn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.842817 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a66b5e77-5e95-4fd0-957c-9d57f62a2238-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.842849 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76djn\" (UniqueName: \"kubernetes.io/projected/a66b5e77-5e95-4fd0-957c-9d57f62a2238-kube-api-access-76djn\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.943561 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.945014 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf7bq\" (UniqueName: \"kubernetes.io/projected/f597ab05-4236-4a1c-95cb-3ce637a2dd52-kube-api-access-zf7bq\") pod \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\" (UID: \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\") " Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.945246 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f597ab05-4236-4a1c-95cb-3ce637a2dd52-operator-scripts\") pod \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\" (UID: \"f597ab05-4236-4a1c-95cb-3ce637a2dd52\") " Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.946287 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f597ab05-4236-4a1c-95cb-3ce637a2dd52-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f597ab05-4236-4a1c-95cb-3ce637a2dd52" (UID: "f597ab05-4236-4a1c-95cb-3ce637a2dd52"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:49:12 crc kubenswrapper[4802]: I1201 20:49:12.949914 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f597ab05-4236-4a1c-95cb-3ce637a2dd52-kube-api-access-zf7bq" (OuterVolumeSpecName: "kube-api-access-zf7bq") pod "f597ab05-4236-4a1c-95cb-3ce637a2dd52" (UID: "f597ab05-4236-4a1c-95cb-3ce637a2dd52"). InnerVolumeSpecName "kube-api-access-zf7bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:12.999247 4802 generic.go:334] "Generic (PLEG): container finished" podID="51562a38-c06f-470d-9224-ea64998f81c6" containerID="5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73" exitCode=143 Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:12.999273 4802 generic.go:334] "Generic (PLEG): container finished" podID="51562a38-c06f-470d-9224-ea64998f81c6" containerID="38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f" exitCode=143 Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:12.999308 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51562a38-c06f-470d-9224-ea64998f81c6","Type":"ContainerDied","Data":"5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73"} Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:12.999330 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51562a38-c06f-470d-9224-ea64998f81c6","Type":"ContainerDied","Data":"38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f"} Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:12.999341 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51562a38-c06f-470d-9224-ea64998f81c6","Type":"ContainerDied","Data":"b47b399a8bf2dc6514724a656b7c15a1e308f8b27e9cd06ce440e81c971da219"} Dec 01 20:49:13 crc 
kubenswrapper[4802]: I1201 20:49:12.999355 4802 scope.go:117] "RemoveContainer" containerID="5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:12.999460 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.010717 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"83603474-dc08-4ea8-a158-cba205dab6da","Type":"ContainerStarted","Data":"81d3eea68f619ad6f7e3ff85c221d111fb536a8a29e7be6218ea14083cc0dea4"}
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.014997 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b76e652f-2feb-41c9-b8f6-16efc7080094","Type":"ContainerStarted","Data":"d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24"}
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.015108 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerName="glance-log" containerID="cri-o://62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11" gracePeriod=30
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.015323 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerName="glance-httpd" containerID="cri-o://d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24" gracePeriod=30
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.020211 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58f6ccd776-gr6kp" event={"ID":"f2a61354-70f8-4e95-ab30-6e1d90128879","Type":"ContainerStarted","Data":"453c089c12f0c993eda2a9db6f02229a6c2f6ff40d03ad1dd2c9798f24ed434d"}
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.027480 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mkqmt"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.027524 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-mkqmt" event={"ID":"a66b5e77-5e95-4fd0-957c-9d57f62a2238","Type":"ContainerDied","Data":"b2f3acd97461676808a653a3b050b665530451fec70dfcf833af3cbb45e36863"}
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.027579 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2f3acd97461676808a653a3b050b665530451fec70dfcf833af3cbb45e36863"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.030958 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"cb0c455b-a5d4-41cf-87c3-673a3deac7cb","Type":"ContainerStarted","Data":"68c3f10032e0727bab279325d00141d4fb8212b6e7d02ea9f1da40273fc75a03"}
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.044774 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3642-account-create-update-p2zgg" event={"ID":"f597ab05-4236-4a1c-95cb-3ce637a2dd52","Type":"ContainerDied","Data":"cafa61e512dde38f393f13c69cef5e2e255ce2ace18d572438bd41933082311d"}
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.044818 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cafa61e512dde38f393f13c69cef5e2e255ce2ace18d572438bd41933082311d"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.044910 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3642-account-create-update-p2zgg"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.045678 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=4.035114144 podStartE2EDuration="5.04566483s" podCreationTimestamp="2025-12-01 20:49:08 +0000 UTC" firstStartedPulling="2025-12-01 20:49:10.416504746 +0000 UTC m=+3171.979064387" lastFinishedPulling="2025-12-01 20:49:11.427055432 +0000 UTC m=+3172.989615073" observedRunningTime="2025-12-01 20:49:13.044739953 +0000 UTC m=+3174.607299604" watchObservedRunningTime="2025-12-01 20:49:13.04566483 +0000 UTC m=+3174.608224471"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.047212 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-logs\") pod \"51562a38-c06f-470d-9224-ea64998f81c6\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.047270 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-scripts\") pod \"51562a38-c06f-470d-9224-ea64998f81c6\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.047312 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-config-data\") pod \"51562a38-c06f-470d-9224-ea64998f81c6\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.047334 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"51562a38-c06f-470d-9224-ea64998f81c6\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.047362 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-ceph\") pod \"51562a38-c06f-470d-9224-ea64998f81c6\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.047388 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-combined-ca-bundle\") pod \"51562a38-c06f-470d-9224-ea64998f81c6\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.047413 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnrb8\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-kube-api-access-jnrb8\") pod \"51562a38-c06f-470d-9224-ea64998f81c6\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.047525 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-httpd-run\") pod \"51562a38-c06f-470d-9224-ea64998f81c6\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.047551 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-internal-tls-certs\") pod \"51562a38-c06f-470d-9224-ea64998f81c6\" (UID: \"51562a38-c06f-470d-9224-ea64998f81c6\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.051017 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-logs" (OuterVolumeSpecName: "logs") pod "51562a38-c06f-470d-9224-ea64998f81c6" (UID: "51562a38-c06f-470d-9224-ea64998f81c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.059591 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b69f75cb8-xrkks" event={"ID":"40185112-89e4-49c3-9ccc-0b190724c5ff","Type":"ContainerStarted","Data":"6e6f2e83e5b5c297bed9954e56716dadc5a3ddf089b5e6f50a566a2871b58959"}
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.059851 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "51562a38-c06f-470d-9224-ea64998f81c6" (UID: "51562a38-c06f-470d-9224-ea64998f81c6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.061451 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf7bq\" (UniqueName: \"kubernetes.io/projected/f597ab05-4236-4a1c-95cb-3ce637a2dd52-kube-api-access-zf7bq\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.061554 4802 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f597ab05-4236-4a1c-95cb-3ce637a2dd52-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.066776 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-ceph" (OuterVolumeSpecName: "ceph") pod "51562a38-c06f-470d-9224-ea64998f81c6" (UID: "51562a38-c06f-470d-9224-ea64998f81c6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.068211 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "51562a38-c06f-470d-9224-ea64998f81c6" (UID: "51562a38-c06f-470d-9224-ea64998f81c6"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.081671 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.927687345 podStartE2EDuration="5.08165397s" podCreationTimestamp="2025-12-01 20:49:08 +0000 UTC" firstStartedPulling="2025-12-01 20:49:09.262156682 +0000 UTC m=+3170.824716323" lastFinishedPulling="2025-12-01 20:49:11.416123307 +0000 UTC m=+3172.978682948" observedRunningTime="2025-12-01 20:49:13.069390406 +0000 UTC m=+3174.631950067" watchObservedRunningTime="2025-12-01 20:49:13.08165397 +0000 UTC m=+3174.644213611"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.085402 4802 scope.go:117] "RemoveContainer" containerID="38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.086250 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-kube-api-access-jnrb8" (OuterVolumeSpecName: "kube-api-access-jnrb8") pod "51562a38-c06f-470d-9224-ea64998f81c6" (UID: "51562a38-c06f-470d-9224-ea64998f81c6"). InnerVolumeSpecName "kube-api-access-jnrb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.086807 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-scripts" (OuterVolumeSpecName: "scripts") pod "51562a38-c06f-470d-9224-ea64998f81c6" (UID: "51562a38-c06f-470d-9224-ea64998f81c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.111463 4802 scope.go:117] "RemoveContainer" containerID="5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73"
Dec 01 20:49:13 crc kubenswrapper[4802]: E1201 20:49:13.111925 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73\": container with ID starting with 5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73 not found: ID does not exist" containerID="5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.111956 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73"} err="failed to get container status \"5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73\": rpc error: code = NotFound desc = could not find container \"5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73\": container with ID starting with 5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73 not found: ID does not exist"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.111978 4802 scope.go:117] "RemoveContainer" containerID="38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.111997 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.111976312 podStartE2EDuration="5.111976312s" podCreationTimestamp="2025-12-01 20:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:49:13.101855072 +0000 UTC m=+3174.664414713" watchObservedRunningTime="2025-12-01 20:49:13.111976312 +0000 UTC m=+3174.674535953"
Dec 01 20:49:13 crc kubenswrapper[4802]: E1201 20:49:13.112259 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f\": container with ID starting with 38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f not found: ID does not exist" containerID="38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.112287 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f"} err="failed to get container status \"38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f\": rpc error: code = NotFound desc = could not find container \"38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f\": container with ID starting with 38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f not found: ID does not exist"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.112306 4802 scope.go:117] "RemoveContainer" containerID="5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.112594 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73"} err="failed to get container status \"5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73\": rpc error: code = NotFound desc = could not find container \"5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73\": container with ID starting with 5b81fa39ee48452b2079418353cfa5e180018150ee65d85925e011430a1eaa73 not found: ID does not exist"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.112612 4802 scope.go:117] "RemoveContainer" containerID="38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.112819 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f"} err="failed to get container status \"38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f\": rpc error: code = NotFound desc = could not find container \"38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f\": container with ID starting with 38db279e8eef8a3b10c4c4715caae51809492ab6dab7bcda476d06df24fda29f not found: ID does not exist"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.121345 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51562a38-c06f-470d-9224-ea64998f81c6" (UID: "51562a38-c06f-470d-9224-ea64998f81c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.138964 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "51562a38-c06f-470d-9224-ea64998f81c6" (UID: "51562a38-c06f-470d-9224-ea64998f81c6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.153752 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-config-data" (OuterVolumeSpecName: "config-data") pod "51562a38-c06f-470d-9224-ea64998f81c6" (UID: "51562a38-c06f-470d-9224-ea64998f81c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.166889 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-logs\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.167643 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.167656 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.167676 4802 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.167686 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-ceph\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.167694 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.167706 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnrb8\" (UniqueName: \"kubernetes.io/projected/51562a38-c06f-470d-9224-ea64998f81c6-kube-api-access-jnrb8\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.167715 4802 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51562a38-c06f-470d-9224-ea64998f81c6-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.167723 4802 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51562a38-c06f-470d-9224-ea64998f81c6-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.186069 4802 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.269407 4802 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.356468 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.372807 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.427059 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.438894 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.449919 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 20:49:13 crc kubenswrapper[4802]: E1201 20:49:13.451990 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f597ab05-4236-4a1c-95cb-3ce637a2dd52" containerName="mariadb-account-create-update"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.452020 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f597ab05-4236-4a1c-95cb-3ce637a2dd52" containerName="mariadb-account-create-update"
Dec 01 20:49:13 crc kubenswrapper[4802]: E1201 20:49:13.452036 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a66b5e77-5e95-4fd0-957c-9d57f62a2238" containerName="mariadb-database-create"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.452054 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a66b5e77-5e95-4fd0-957c-9d57f62a2238" containerName="mariadb-database-create"
Dec 01 20:49:13 crc kubenswrapper[4802]: E1201 20:49:13.452076 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51562a38-c06f-470d-9224-ea64998f81c6" containerName="glance-httpd"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.452084 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="51562a38-c06f-470d-9224-ea64998f81c6" containerName="glance-httpd"
Dec 01 20:49:13 crc kubenswrapper[4802]: E1201 20:49:13.452107 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51562a38-c06f-470d-9224-ea64998f81c6" containerName="glance-log"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.452114 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="51562a38-c06f-470d-9224-ea64998f81c6" containerName="glance-log"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.452330 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a66b5e77-5e95-4fd0-957c-9d57f62a2238" containerName="mariadb-database-create"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.452345 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="51562a38-c06f-470d-9224-ea64998f81c6" containerName="glance-log"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.452361 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="51562a38-c06f-470d-9224-ea64998f81c6" containerName="glance-httpd"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.452371 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f597ab05-4236-4a1c-95cb-3ce637a2dd52" containerName="mariadb-account-create-update"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.454966 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.456812 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.459565 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.470989 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.574364 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.574416 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4szf\" (UniqueName: \"kubernetes.io/projected/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-kube-api-access-v4szf\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.574447 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.574498 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.574709 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.574760 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-ceph\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.574794 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-logs\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.574831 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.574846 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.676826 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.677214 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4szf\" (UniqueName: \"kubernetes.io/projected/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-kube-api-access-v4szf\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.677249 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.677270 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.677298 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.677339 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-ceph\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.677371 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-logs\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.677409 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.677427 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.690426 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.698162 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.700670 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-logs\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.703441 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.704093 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.704435 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.704990 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-ceph\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.714456 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4szf\" (UniqueName: \"kubernetes.io/projected/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-kube-api-access-v4szf\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.714969 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: E1201 20:49:13.720115 4802 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51562a38_c06f_470d_9224_ea64998f81c6.slice/crio-b47b399a8bf2dc6514724a656b7c15a1e308f8b27e9cd06ce440e81c971da219\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51562a38_c06f_470d_9224_ea64998f81c6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76e652f_2feb_41c9_b8f6_16efc7080094.slice/crio-conmon-d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.728523 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae\") " pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.783317 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.814012 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.983636 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-logs\") pod \"b76e652f-2feb-41c9-b8f6-16efc7080094\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.984018 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-httpd-run\") pod \"b76e652f-2feb-41c9-b8f6-16efc7080094\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.984079 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-combined-ca-bundle\") pod \"b76e652f-2feb-41c9-b8f6-16efc7080094\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.984118 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-config-data\") pod \"b76e652f-2feb-41c9-b8f6-16efc7080094\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.984150 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b76e652f-2feb-41c9-b8f6-16efc7080094\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.984253 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-ceph\") pod \"b76e652f-2feb-41c9-b8f6-16efc7080094\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.984304 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqqt8\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-kube-api-access-gqqt8\") pod \"b76e652f-2feb-41c9-b8f6-16efc7080094\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.984346 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-public-tls-certs\") pod \"b76e652f-2feb-41c9-b8f6-16efc7080094\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.984384 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-scripts\") pod \"b76e652f-2feb-41c9-b8f6-16efc7080094\" (UID: \"b76e652f-2feb-41c9-b8f6-16efc7080094\") "
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.993805 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-ceph" (OuterVolumeSpecName: "ceph") pod "b76e652f-2feb-41c9-b8f6-16efc7080094" (UID: "b76e652f-2feb-41c9-b8f6-16efc7080094"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.994131 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b76e652f-2feb-41c9-b8f6-16efc7080094" (UID: "b76e652f-2feb-41c9-b8f6-16efc7080094").
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.994380 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-logs" (OuterVolumeSpecName: "logs") pod "b76e652f-2feb-41c9-b8f6-16efc7080094" (UID: "b76e652f-2feb-41c9-b8f6-16efc7080094"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.994511 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-kube-api-access-gqqt8" (OuterVolumeSpecName: "kube-api-access-gqqt8") pod "b76e652f-2feb-41c9-b8f6-16efc7080094" (UID: "b76e652f-2feb-41c9-b8f6-16efc7080094"). InnerVolumeSpecName "kube-api-access-gqqt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.996921 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-scripts" (OuterVolumeSpecName: "scripts") pod "b76e652f-2feb-41c9-b8f6-16efc7080094" (UID: "b76e652f-2feb-41c9-b8f6-16efc7080094"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:49:13 crc kubenswrapper[4802]: I1201 20:49:13.997861 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "b76e652f-2feb-41c9-b8f6-16efc7080094" (UID: "b76e652f-2feb-41c9-b8f6-16efc7080094"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.033393 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b76e652f-2feb-41c9-b8f6-16efc7080094" (UID: "b76e652f-2feb-41c9-b8f6-16efc7080094"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.055319 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-config-data" (OuterVolumeSpecName: "config-data") pod "b76e652f-2feb-41c9-b8f6-16efc7080094" (UID: "b76e652f-2feb-41c9-b8f6-16efc7080094"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.080063 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b76e652f-2feb-41c9-b8f6-16efc7080094" (UID: "b76e652f-2feb-41c9-b8f6-16efc7080094"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.081390 4802 generic.go:334] "Generic (PLEG): container finished" podID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerID="d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24" exitCode=0 Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.081420 4802 generic.go:334] "Generic (PLEG): container finished" podID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerID="62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11" exitCode=143 Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.081528 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.081617 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b76e652f-2feb-41c9-b8f6-16efc7080094","Type":"ContainerDied","Data":"d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24"} Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.081673 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b76e652f-2feb-41c9-b8f6-16efc7080094","Type":"ContainerDied","Data":"62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11"} Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.081692 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b76e652f-2feb-41c9-b8f6-16efc7080094","Type":"ContainerDied","Data":"55c8da4ff5209bc5be588b7f5bba16de02edc5da5e8c972cad44f6d8875fb661"} Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.081712 4802 scope.go:117] "RemoveContainer" containerID="d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.087581 4802 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.087611 4802 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b76e652f-2feb-41c9-b8f6-16efc7080094-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.087624 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.087636 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.087672 4802 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.087683 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.087694 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqqt8\" (UniqueName: \"kubernetes.io/projected/b76e652f-2feb-41c9-b8f6-16efc7080094-kube-api-access-gqqt8\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.087705 4802 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:14 crc 
kubenswrapper[4802]: I1201 20:49:14.087715 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b76e652f-2feb-41c9-b8f6-16efc7080094-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.119327 4802 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.140443 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.164675 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.166991 4802 scope.go:117] "RemoveContainer" containerID="62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.174566 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 20:49:14 crc kubenswrapper[4802]: E1201 20:49:14.175072 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerName="glance-httpd" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.175092 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerName="glance-httpd" Dec 01 20:49:14 crc kubenswrapper[4802]: E1201 20:49:14.175119 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerName="glance-log" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.175127 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerName="glance-log" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.175377 4802 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerName="glance-log" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.175404 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76e652f-2feb-41c9-b8f6-16efc7080094" containerName="glance-httpd" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.176654 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.178889 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.179777 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.194751 4802 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.196871 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.219981 4802 scope.go:117] "RemoveContainer" containerID="d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24" Dec 01 20:49:14 crc kubenswrapper[4802]: E1201 20:49:14.220405 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24\": container with ID starting with d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24 not found: ID does not exist" containerID="d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.220456 4802 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24"} err="failed to get container status \"d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24\": rpc error: code = NotFound desc = could not find container \"d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24\": container with ID starting with d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24 not found: ID does not exist" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.220490 4802 scope.go:117] "RemoveContainer" containerID="62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11" Dec 01 20:49:14 crc kubenswrapper[4802]: E1201 20:49:14.221042 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11\": container with ID starting with 62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11 not found: ID does not exist" containerID="62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.221173 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11"} err="failed to get container status \"62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11\": rpc error: code = NotFound desc = could not find container \"62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11\": container with ID starting with 62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11 not found: ID does not exist" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.221268 4802 scope.go:117] "RemoveContainer" containerID="d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 
20:49:14.224041 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24"} err="failed to get container status \"d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24\": rpc error: code = NotFound desc = could not find container \"d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24\": container with ID starting with d4d077dbe36470e14b21786447c24eebd47a54009b16645e38737b25b277aa24 not found: ID does not exist" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.224105 4802 scope.go:117] "RemoveContainer" containerID="62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.224640 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11"} err="failed to get container status \"62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11\": rpc error: code = NotFound desc = could not find container \"62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11\": container with ID starting with 62fc542631130970b258aea9585d746749de032ae92eaf48182acbe39a655b11 not found: ID does not exist" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.296675 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-scripts\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.296789 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.297810 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.297851 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.298084 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5bpm\" (UniqueName: \"kubernetes.io/projected/30bb57c2-94ae-48ad-9e68-0b595b58246b-kube-api-access-d5bpm\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.298117 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30bb57c2-94ae-48ad-9e68-0b595b58246b-logs\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.298164 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/30bb57c2-94ae-48ad-9e68-0b595b58246b-ceph\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.298290 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.298319 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30bb57c2-94ae-48ad-9e68-0b595b58246b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.400595 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-scripts\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.400645 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-config-data\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.400691 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.400716 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.400785 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5bpm\" (UniqueName: \"kubernetes.io/projected/30bb57c2-94ae-48ad-9e68-0b595b58246b-kube-api-access-d5bpm\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.400811 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30bb57c2-94ae-48ad-9e68-0b595b58246b-logs\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.400844 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/30bb57c2-94ae-48ad-9e68-0b595b58246b-ceph\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.400904 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.400924 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30bb57c2-94ae-48ad-9e68-0b595b58246b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.401332 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.401509 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/30bb57c2-94ae-48ad-9e68-0b595b58246b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.401701 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30bb57c2-94ae-48ad-9e68-0b595b58246b-logs\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.407430 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " 
pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.407441 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-scripts\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.409664 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.411077 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/30bb57c2-94ae-48ad-9e68-0b595b58246b-ceph\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.425982 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bb57c2-94ae-48ad-9e68-0b595b58246b-config-data\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.433805 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5bpm\" (UniqueName: \"kubernetes.io/projected/30bb57c2-94ae-48ad-9e68-0b595b58246b-kube-api-access-d5bpm\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0" Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.443169 4802 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"30bb57c2-94ae-48ad-9e68-0b595b58246b\") " pod="openstack/glance-default-external-api-0"
Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.549510 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.597273 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.736153 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51562a38-c06f-470d-9224-ea64998f81c6" path="/var/lib/kubelet/pods/51562a38-c06f-470d-9224-ea64998f81c6/volumes"
Dec 01 20:49:14 crc kubenswrapper[4802]: I1201 20:49:14.737297 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76e652f-2feb-41c9-b8f6-16efc7080094" path="/var/lib/kubelet/pods/b76e652f-2feb-41c9-b8f6-16efc7080094/volumes"
Dec 01 20:49:15 crc kubenswrapper[4802]: I1201 20:49:15.094938 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae","Type":"ContainerStarted","Data":"0940588b123384f84a72a51f0ba2b79e0911c849ca2885cb0142275c17e727ee"}
Dec 01 20:49:15 crc kubenswrapper[4802]: I1201 20:49:15.267633 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 01 20:49:16 crc kubenswrapper[4802]: I1201 20:49:16.112176 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30bb57c2-94ae-48ad-9e68-0b595b58246b","Type":"ContainerStarted","Data":"8d0f3be4b2d7f685da84602492e0a433c8ea240d038467048cb5ca41673f0d3e"}
Dec 01 20:49:16 crc kubenswrapper[4802]: I1201 20:49:16.112893 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30bb57c2-94ae-48ad-9e68-0b595b58246b","Type":"ContainerStarted","Data":"4fc2a1a26a648537cb5dddefec4af119277c8984d16f02c6cf6319585019026f"}
Dec 01 20:49:16 crc kubenswrapper[4802]: I1201 20:49:16.119224 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae","Type":"ContainerStarted","Data":"e2e4f42b606ab66eb489ac9ae55a5017bd1bd54a9cca360d9e8a6bf1740436fb"}
Dec 01 20:49:17 crc kubenswrapper[4802]: I1201 20:49:17.132146 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae","Type":"ContainerStarted","Data":"aecacbbfb201ec3429cdcac49bf79e9739ebc0a708db988939e7d0f58b98630d"}
Dec 01 20:49:17 crc kubenswrapper[4802]: I1201 20:49:17.134878 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"30bb57c2-94ae-48ad-9e68-0b595b58246b","Type":"ContainerStarted","Data":"321c4e9d09ce3072779d32b69bda1c4397cb79130aaf4aaa35105b86464a2e61"}
Dec 01 20:49:17 crc kubenswrapper[4802]: I1201 20:49:17.165528 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.165507145 podStartE2EDuration="4.165507145s" podCreationTimestamp="2025-12-01 20:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:49:17.153353813 +0000 UTC m=+3178.715913454" watchObservedRunningTime="2025-12-01 20:49:17.165507145 +0000 UTC m=+3178.728066786"
Dec 01 20:49:17 crc kubenswrapper[4802]: I1201 20:49:17.177988 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.177969206 podStartE2EDuration="3.177969206s" podCreationTimestamp="2025-12-01 20:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:49:17.174862013 +0000 UTC m=+3178.737421664" watchObservedRunningTime="2025-12-01 20:49:17.177969206 +0000 UTC m=+3178.740528847"
Dec 01 20:49:18 crc kubenswrapper[4802]: I1201 20:49:18.577369 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Dec 01 20:49:18 crc kubenswrapper[4802]: I1201 20:49:18.588861 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.027980 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-bms4z"]
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.029748 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.032254 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.032767 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-dnqgf"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.037534 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-bms4z"]
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.126859 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-job-config-data\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.126920 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-config-data\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.127001 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-combined-ca-bundle\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.127042 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72mpp\" (UniqueName: \"kubernetes.io/projected/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-kube-api-access-72mpp\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.229456 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-combined-ca-bundle\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.229529 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72mpp\" (UniqueName: \"kubernetes.io/projected/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-kube-api-access-72mpp\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.229661 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-job-config-data\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.229696 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-config-data\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.234953 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-job-config-data\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.236470 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-config-data\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.237099 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-combined-ca-bundle\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.248835 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72mpp\" (UniqueName: \"kubernetes.io/projected/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-kube-api-access-72mpp\") pod \"manila-db-sync-bms4z\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") " pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:19 crc kubenswrapper[4802]: I1201 20:49:19.364818 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:22 crc kubenswrapper[4802]: I1201 20:49:22.511419 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-bms4z"]
Dec 01 20:49:22 crc kubenswrapper[4802]: W1201 20:49:22.525795 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod402ca7c9_4910_43ff_b2f0_0e1c3d35ff7c.slice/crio-bfa795414201275efe2e46cc95e0d41052e8425c0fc1a4307a651f4d4774261d WatchSource:0}: Error finding container bfa795414201275efe2e46cc95e0d41052e8425c0fc1a4307a651f4d4774261d: Status 404 returned error can't find the container with id bfa795414201275efe2e46cc95e0d41052e8425c0fc1a4307a651f4d4774261d
Dec 01 20:49:22 crc kubenswrapper[4802]: I1201 20:49:22.720095 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098"
Dec 01 20:49:22 crc kubenswrapper[4802]: E1201 20:49:22.720640 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.213743 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-bms4z" event={"ID":"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c","Type":"ContainerStarted","Data":"bfa795414201275efe2e46cc95e0d41052e8425c0fc1a4307a651f4d4774261d"}
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.216948 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b69f75cb8-xrkks" event={"ID":"40185112-89e4-49c3-9ccc-0b190724c5ff","Type":"ContainerStarted","Data":"fd5392170e744348ccb0fdd3ea01b211684775beb51a5532fa5c901c35e78bde"}
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.217034 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b69f75cb8-xrkks" event={"ID":"40185112-89e4-49c3-9ccc-0b190724c5ff","Type":"ContainerStarted","Data":"2f822bd812314d99e18d3dd9a2a6178ca54e6c20303eead0f4a3a68ef5506803"}
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.219308 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769b6f4d57-mlfdl" event={"ID":"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9","Type":"ContainerStarted","Data":"53ea3a255cc779eca997cff9d75eac85026808c121724c77d6caf3cd0cdf46d6"}
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.219360 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769b6f4d57-mlfdl" event={"ID":"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9","Type":"ContainerStarted","Data":"6fc3e7343ae40f36a4591583dc681ac03d10d6e49fed88bf9049f12db037853a"}
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.219350 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-769b6f4d57-mlfdl" podUID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerName="horizon-log" containerID="cri-o://6fc3e7343ae40f36a4591583dc681ac03d10d6e49fed88bf9049f12db037853a" gracePeriod=30
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.219418 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-769b6f4d57-mlfdl" podUID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerName="horizon" containerID="cri-o://53ea3a255cc779eca997cff9d75eac85026808c121724c77d6caf3cd0cdf46d6" gracePeriod=30
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.242276 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6468fb5467-v7gmh" event={"ID":"c071511c-6a08-4631-b449-92a8a12d69f4","Type":"ContainerStarted","Data":"1aa285c8b297734162708ff008d5e2c123d15db7271330d5cff13f5c9172276c"}
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.242361 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6468fb5467-v7gmh" podUID="c071511c-6a08-4631-b449-92a8a12d69f4" containerName="horizon-log" containerID="cri-o://28b740a183951dbc8b038cecbaf80096e654f5d43f4b08737d83e53fb5fb5012" gracePeriod=30
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.242481 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6468fb5467-v7gmh" event={"ID":"c071511c-6a08-4631-b449-92a8a12d69f4","Type":"ContainerStarted","Data":"28b740a183951dbc8b038cecbaf80096e654f5d43f4b08737d83e53fb5fb5012"}
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.242382 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6468fb5467-v7gmh" podUID="c071511c-6a08-4631-b449-92a8a12d69f4" containerName="horizon" containerID="cri-o://1aa285c8b297734162708ff008d5e2c123d15db7271330d5cff13f5c9172276c" gracePeriod=30
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.245209 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58f6ccd776-gr6kp" event={"ID":"f2a61354-70f8-4e95-ab30-6e1d90128879","Type":"ContainerStarted","Data":"232156007ccc94a24dde9aeeb757f17c9242d6806240397fa3b536de5424c9e1"}
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.245270 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58f6ccd776-gr6kp" event={"ID":"f2a61354-70f8-4e95-ab30-6e1d90128879","Type":"ContainerStarted","Data":"de897dcdcc75080e2b98ab343ee6181eade9fc23a8da10a7e641218df7f93229"}
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.263962 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b69f75cb8-xrkks" podStartSLOduration=2.766901659 podStartE2EDuration="12.263945348s" podCreationTimestamp="2025-12-01 20:49:11 +0000 UTC" firstStartedPulling="2025-12-01 20:49:12.564784393 +0000 UTC m=+3174.127344034" lastFinishedPulling="2025-12-01 20:49:22.061828062 +0000 UTC m=+3183.624387723" observedRunningTime="2025-12-01 20:49:23.261559857 +0000 UTC m=+3184.824119498" watchObservedRunningTime="2025-12-01 20:49:23.263945348 +0000 UTC m=+3184.826504979"
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.287275 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6468fb5467-v7gmh" podStartSLOduration=2.990085311 podStartE2EDuration="15.287252181s" podCreationTimestamp="2025-12-01 20:49:08 +0000 UTC" firstStartedPulling="2025-12-01 20:49:09.863906855 +0000 UTC m=+3171.426466496" lastFinishedPulling="2025-12-01 20:49:22.161073715 +0000 UTC m=+3183.723633366" observedRunningTime="2025-12-01 20:49:23.279077768 +0000 UTC m=+3184.841637509" watchObservedRunningTime="2025-12-01 20:49:23.287252181 +0000 UTC m=+3184.849811822"
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.318832 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58f6ccd776-gr6kp" podStartSLOduration=2.385634005 podStartE2EDuration="12.318815281s" podCreationTimestamp="2025-12-01 20:49:11 +0000 UTC" firstStartedPulling="2025-12-01 20:49:12.226476377 +0000 UTC m=+3173.789036018" lastFinishedPulling="2025-12-01 20:49:22.159657653 +0000 UTC m=+3183.722217294" observedRunningTime="2025-12-01 20:49:23.306275897 +0000 UTC m=+3184.868835538" watchObservedRunningTime="2025-12-01 20:49:23.318815281 +0000 UTC m=+3184.881374922"
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.341905 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-769b6f4d57-mlfdl" podStartSLOduration=3.181137275 podStartE2EDuration="15.341883546s" podCreationTimestamp="2025-12-01 20:49:08 +0000 UTC" firstStartedPulling="2025-12-01 20:49:09.875549882 +0000 UTC m=+3171.438109523" lastFinishedPulling="2025-12-01 20:49:22.036296103 +0000 UTC m=+3183.598855794" observedRunningTime="2025-12-01 20:49:23.326147129 +0000 UTC m=+3184.888706770" watchObservedRunningTime="2025-12-01 20:49:23.341883546 +0000 UTC m=+3184.904443187"
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.785261 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.785608 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.819722 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:23 crc kubenswrapper[4802]: I1201 20:49:23.832349 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:24 crc kubenswrapper[4802]: I1201 20:49:24.256001 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:24 crc kubenswrapper[4802]: I1201 20:49:24.256047 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:24 crc kubenswrapper[4802]: I1201 20:49:24.597728 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:24 crc kubenswrapper[4802]: I1201 20:49:24.598043 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:24 crc kubenswrapper[4802]: I1201 20:49:24.638209 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:24 crc kubenswrapper[4802]: I1201 20:49:24.639691 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:25 crc kubenswrapper[4802]: I1201 20:49:25.266630 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:25 crc kubenswrapper[4802]: I1201 20:49:25.266947 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:26 crc kubenswrapper[4802]: I1201 20:49:26.531778 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:26 crc kubenswrapper[4802]: I1201 20:49:26.532230 4802 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 01 20:49:26 crc kubenswrapper[4802]: I1201 20:49:26.533914 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 01 20:49:27 crc kubenswrapper[4802]: I1201 20:49:27.602001 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:27 crc kubenswrapper[4802]: I1201 20:49:27.602117 4802 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 01 20:49:27 crc kubenswrapper[4802]: I1201 20:49:27.603463 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 01 20:49:29 crc kubenswrapper[4802]: I1201 20:49:29.126467 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6468fb5467-v7gmh"
Dec 01 20:49:29 crc kubenswrapper[4802]: I1201 20:49:29.201736 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-769b6f4d57-mlfdl"
Dec 01 20:49:29 crc kubenswrapper[4802]: I1201 20:49:29.315746 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-bms4z" event={"ID":"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c","Type":"ContainerStarted","Data":"dc420b054e47fa4ef46ad49e178b81450b5981ddd17e3ae49488b58443937c91"}
Dec 01 20:49:29 crc kubenswrapper[4802]: I1201 20:49:29.352263 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-bms4z" podStartSLOduration=5.252960194 podStartE2EDuration="10.352241789s" podCreationTimestamp="2025-12-01 20:49:19 +0000 UTC" firstStartedPulling="2025-12-01 20:49:22.52860637 +0000 UTC m=+3184.091166011" lastFinishedPulling="2025-12-01 20:49:27.627887975 +0000 UTC m=+3189.190447606" observedRunningTime="2025-12-01 20:49:29.338350685 +0000 UTC m=+3190.900910326" watchObservedRunningTime="2025-12-01 20:49:29.352241789 +0000 UTC m=+3190.914801440"
Dec 01 20:49:31 crc kubenswrapper[4802]: I1201 20:49:31.624654 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b69f75cb8-xrkks"
Dec 01 20:49:31 crc kubenswrapper[4802]: I1201 20:49:31.624897 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b69f75cb8-xrkks"
Dec 01 20:49:31 crc kubenswrapper[4802]: I1201 20:49:31.638783 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-58f6ccd776-gr6kp"
Dec 01 20:49:31 crc kubenswrapper[4802]: I1201 20:49:31.638824 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58f6ccd776-gr6kp"
Dec 01 20:49:37 crc kubenswrapper[4802]: I1201 20:49:37.719981 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098"
Dec 01 20:49:37 crc kubenswrapper[4802]: E1201 20:49:37.721409 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 20:49:39 crc kubenswrapper[4802]: I1201 20:49:39.447254 4802 generic.go:334] "Generic (PLEG): container finished" podID="402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c" containerID="dc420b054e47fa4ef46ad49e178b81450b5981ddd17e3ae49488b58443937c91" exitCode=0
Dec 01 20:49:39 crc kubenswrapper[4802]: I1201 20:49:39.447337 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-bms4z" event={"ID":"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c","Type":"ContainerDied","Data":"dc420b054e47fa4ef46ad49e178b81450b5981ddd17e3ae49488b58443937c91"}
Dec 01 20:49:40 crc kubenswrapper[4802]: I1201 20:49:40.951574 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.060666 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-combined-ca-bundle\") pod \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") "
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.060765 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72mpp\" (UniqueName: \"kubernetes.io/projected/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-kube-api-access-72mpp\") pod \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") "
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.060966 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-config-data\") pod \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") "
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.061048 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-job-config-data\") pod \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\" (UID: \"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c\") "
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.078504 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-kube-api-access-72mpp" (OuterVolumeSpecName: "kube-api-access-72mpp") pod "402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c" (UID: "402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c"). InnerVolumeSpecName "kube-api-access-72mpp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.080707 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c" (UID: "402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.082176 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-config-data" (OuterVolumeSpecName: "config-data") pod "402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c" (UID: "402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.089274 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c" (UID: "402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.163029 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.163065 4802 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-job-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.163076 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.163085 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72mpp\" (UniqueName: \"kubernetes.io/projected/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c-kube-api-access-72mpp\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.479835 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-bms4z" event={"ID":"402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c","Type":"ContainerDied","Data":"bfa795414201275efe2e46cc95e0d41052e8425c0fc1a4307a651f4d4774261d"}
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.480224 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa795414201275efe2e46cc95e0d41052e8425c0fc1a4307a651f4d4774261d"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.480350 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-bms4z"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.635302 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b69f75cb8-xrkks" podUID="40185112-89e4-49c3-9ccc-0b190724c5ff" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.240:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.240:8443: connect: connection refused"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.639744 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-58f6ccd776-gr6kp" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.779572 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Dec 01 20:49:41 crc kubenswrapper[4802]: E1201 20:49:41.779977 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c" containerName="manila-db-sync"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.779994 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c" containerName="manila-db-sync"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.780213 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c" containerName="manila-db-sync"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.781239 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.783800 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-dnqgf"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.788871 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.789074 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.789224 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.791472 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.793496 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.797201 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.809016 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.820715 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.882345 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.882625 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.883168 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.883295 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.883431 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.883505 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.883642 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-scripts\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.883727 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-scripts\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.883835 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c800d9a-2aa1-49df-a30b-23f36117381b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.884945 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.885066 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfwjl\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-kube-api-access-tfwjl\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.885144 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rknr\" (UniqueName: \"kubernetes.io/projected/4c800d9a-2aa1-49df-a30b-23f36117381b-kube-api-access-4rknr\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.885292 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.885448 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-ceph\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.937073 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-jwp25"]
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.945020 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-jwp25"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.979146 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-jwp25"]
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.989193 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndbrl\" (UniqueName: \"kubernetes.io/projected/66a9cb74-956c-4846-91b9-a4dac0834347-kube-api-access-ndbrl\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.989567 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.989658 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-ceph\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.989732 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0"
Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.989866 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.989985 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990068 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990197 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990301 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990379 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990470 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-scripts\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990580 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-config\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990665 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-scripts\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990742 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990823 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c800d9a-2aa1-49df-a30b-23f36117381b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 
20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990923 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.990996 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.991072 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfwjl\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-kube-api-access-tfwjl\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.991148 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rknr\" (UniqueName: \"kubernetes.io/projected/4c800d9a-2aa1-49df-a30b-23f36117381b-kube-api-access-4rknr\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.991256 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.992326 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c800d9a-2aa1-49df-a30b-23f36117381b-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.993412 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.996533 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:41 crc kubenswrapper[4802]: I1201 20:49:41.999743 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.001553 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.001590 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-ceph\") pod \"manila-share-share1-0\" 
(UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.002516 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.005430 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.005877 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.008608 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-scripts\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.012162 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.013518 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-scripts\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.042509 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfwjl\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-kube-api-access-tfwjl\") pod \"manila-share-share1-0\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " pod="openstack/manila-share-share1-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.044837 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rknr\" (UniqueName: \"kubernetes.io/projected/4c800d9a-2aa1-49df-a30b-23f36117381b-kube-api-access-4rknr\") pod \"manila-scheduler-0\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") " pod="openstack/manila-scheduler-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.094890 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndbrl\" (UniqueName: \"kubernetes.io/projected/66a9cb74-956c-4846-91b9-a4dac0834347-kube-api-access-ndbrl\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.095354 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.095535 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.095635 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-config\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.095714 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.095878 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.097089 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.098091 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.110727 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.111814 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.112228 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-config\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.114921 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66a9cb74-956c-4846-91b9-a4dac0834347-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.126938 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.158873 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndbrl\" (UniqueName: \"kubernetes.io/projected/66a9cb74-956c-4846-91b9-a4dac0834347-kube-api-access-ndbrl\") pod \"dnsmasq-dns-76b5fdb995-jwp25\" (UID: \"66a9cb74-956c-4846-91b9-a4dac0834347\") " pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.171502 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.173080 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.191822 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.241272 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.263534 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.307286 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.307336 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.307379 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn6cs\" (UniqueName: \"kubernetes.io/projected/d5130da2-132e-4a81-ace7-f1e9d4d790d4-kube-api-access-nn6cs\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.307437 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5130da2-132e-4a81-ace7-f1e9d4d790d4-etc-machine-id\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.307455 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-scripts\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 
20:49:42.307507 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data-custom\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.307525 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5130da2-132e-4a81-ace7-f1e9d4d790d4-logs\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.410458 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.410530 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.410578 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn6cs\" (UniqueName: \"kubernetes.io/projected/d5130da2-132e-4a81-ace7-f1e9d4d790d4-kube-api-access-nn6cs\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.410642 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d5130da2-132e-4a81-ace7-f1e9d4d790d4-etc-machine-id\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.410658 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-scripts\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.410712 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data-custom\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.410731 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5130da2-132e-4a81-ace7-f1e9d4d790d4-logs\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.412437 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5130da2-132e-4a81-ace7-f1e9d4d790d4-etc-machine-id\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.412622 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5130da2-132e-4a81-ace7-f1e9d4d790d4-logs\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.422852 4802 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-scripts\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.422968 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data-custom\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.426022 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.429034 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.449867 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn6cs\" (UniqueName: \"kubernetes.io/projected/d5130da2-132e-4a81-ace7-f1e9d4d790d4-kube-api-access-nn6cs\") pod \"manila-api-0\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") " pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.602285 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 01 20:49:42 crc kubenswrapper[4802]: I1201 20:49:42.960389 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 20:49:42 crc kubenswrapper[4802]: W1201 20:49:42.985431 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c800d9a_2aa1_49df_a30b_23f36117381b.slice/crio-83a1a3bfe464258a6ebd0d7e9d32efaa88dfaeb7b81b0e896c50eab590ce42e6 WatchSource:0}: Error finding container 83a1a3bfe464258a6ebd0d7e9d32efaa88dfaeb7b81b0e896c50eab590ce42e6: Status 404 returned error can't find the container with id 83a1a3bfe464258a6ebd0d7e9d32efaa88dfaeb7b81b0e896c50eab590ce42e6 Dec 01 20:49:43 crc kubenswrapper[4802]: I1201 20:49:43.132635 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-jwp25"] Dec 01 20:49:43 crc kubenswrapper[4802]: I1201 20:49:43.143268 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 20:49:43 crc kubenswrapper[4802]: I1201 20:49:43.532525 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1","Type":"ContainerStarted","Data":"98e98be1dc1a8f5628b6d377e4a7ea156b46672f1c5b38e99093ff3dff89a6f8"} Dec 01 20:49:43 crc kubenswrapper[4802]: I1201 20:49:43.548637 4802 generic.go:334] "Generic (PLEG): container finished" podID="66a9cb74-956c-4846-91b9-a4dac0834347" containerID="452e63a8a82e019caa0632ee4f45c97c98ba86a6d00623ef2b11342c3f8f6e8c" exitCode=0 Dec 01 20:49:43 crc kubenswrapper[4802]: I1201 20:49:43.548750 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" event={"ID":"66a9cb74-956c-4846-91b9-a4dac0834347","Type":"ContainerDied","Data":"452e63a8a82e019caa0632ee4f45c97c98ba86a6d00623ef2b11342c3f8f6e8c"} Dec 01 20:49:43 crc kubenswrapper[4802]: I1201 
20:49:43.548833 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" event={"ID":"66a9cb74-956c-4846-91b9-a4dac0834347","Type":"ContainerStarted","Data":"44c7ec942b4f9c6dfdac87070e529adb5a581bb740c6fa587fef251f68df4a91"}
Dec 01 20:49:43 crc kubenswrapper[4802]: I1201 20:49:43.549976 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4c800d9a-2aa1-49df-a30b-23f36117381b","Type":"ContainerStarted","Data":"83a1a3bfe464258a6ebd0d7e9d32efaa88dfaeb7b81b0e896c50eab590ce42e6"}
Dec 01 20:49:44 crc kubenswrapper[4802]: I1201 20:49:44.111840 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Dec 01 20:49:44 crc kubenswrapper[4802]: I1201 20:49:44.571170 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d5130da2-132e-4a81-ace7-f1e9d4d790d4","Type":"ContainerStarted","Data":"8d00dc63a05405e4bc50f0d4915cf0af4435f5cee92c09b46ce9d7c31ec5991b"}
Dec 01 20:49:44 crc kubenswrapper[4802]: I1201 20:49:44.575643 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" event={"ID":"66a9cb74-956c-4846-91b9-a4dac0834347","Type":"ContainerStarted","Data":"258ed9f970d7bf78003c32533e07939a14cf1f2d5b8ae66eadb92deeca533dc9"}
Dec 01 20:49:44 crc kubenswrapper[4802]: I1201 20:49:44.576811 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-jwp25"
Dec 01 20:49:44 crc kubenswrapper[4802]: I1201 20:49:44.601652 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-jwp25" podStartSLOduration=3.601635907 podStartE2EDuration="3.601635907s" podCreationTimestamp="2025-12-01 20:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:49:44.598216121 +0000 UTC m=+3206.160775762" watchObservedRunningTime="2025-12-01 20:49:44.601635907 +0000 UTC m=+3206.164195548"
Dec 01 20:49:45 crc kubenswrapper[4802]: I1201 20:49:45.300861 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Dec 01 20:49:45 crc kubenswrapper[4802]: I1201 20:49:45.597528 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d5130da2-132e-4a81-ace7-f1e9d4d790d4","Type":"ContainerStarted","Data":"f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18"}
Dec 01 20:49:45 crc kubenswrapper[4802]: I1201 20:49:45.602302 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4c800d9a-2aa1-49df-a30b-23f36117381b","Type":"ContainerStarted","Data":"9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154"}
Dec 01 20:49:46 crc kubenswrapper[4802]: I1201 20:49:46.615261 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d5130da2-132e-4a81-ace7-f1e9d4d790d4","Type":"ContainerStarted","Data":"e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334"}
Dec 01 20:49:46 crc kubenswrapper[4802]: I1201 20:49:46.615624 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Dec 01 20:49:46 crc kubenswrapper[4802]: I1201 20:49:46.615319 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerName="manila-api-log" containerID="cri-o://f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18" gracePeriod=30
Dec 01 20:49:46 crc kubenswrapper[4802]: I1201 20:49:46.615732 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerName="manila-api" containerID="cri-o://e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334" gracePeriod=30
Dec 01 20:49:46 crc kubenswrapper[4802]: I1201 20:49:46.623052 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4c800d9a-2aa1-49df-a30b-23f36117381b","Type":"ContainerStarted","Data":"92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9"}
Dec 01 20:49:46 crc kubenswrapper[4802]: I1201 20:49:46.645386 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.645359366 podStartE2EDuration="4.645359366s" podCreationTimestamp="2025-12-01 20:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:49:46.635611722 +0000 UTC m=+3208.198171383" watchObservedRunningTime="2025-12-01 20:49:46.645359366 +0000 UTC m=+3208.207919007"
Dec 01 20:49:46 crc kubenswrapper[4802]: I1201 20:49:46.668404 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.869125202 podStartE2EDuration="5.668381773s" podCreationTimestamp="2025-12-01 20:49:41 +0000 UTC" firstStartedPulling="2025-12-01 20:49:42.988881522 +0000 UTC m=+3204.551441163" lastFinishedPulling="2025-12-01 20:49:43.788138093 +0000 UTC m=+3205.350697734" observedRunningTime="2025-12-01 20:49:46.659466645 +0000 UTC m=+3208.222026296" watchObservedRunningTime="2025-12-01 20:49:46.668381773 +0000 UTC m=+3208.230941414"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.477973 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.633775 4802 generic.go:334] "Generic (PLEG): container finished" podID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerID="e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334" exitCode=0
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.633817 4802 generic.go:334] "Generic (PLEG): container finished" podID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerID="f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18" exitCode=143
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.633833 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.633858 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d5130da2-132e-4a81-ace7-f1e9d4d790d4","Type":"ContainerDied","Data":"e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334"}
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.633897 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d5130da2-132e-4a81-ace7-f1e9d4d790d4","Type":"ContainerDied","Data":"f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18"}
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.633908 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"d5130da2-132e-4a81-ace7-f1e9d4d790d4","Type":"ContainerDied","Data":"8d00dc63a05405e4bc50f0d4915cf0af4435f5cee92c09b46ce9d7c31ec5991b"}
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.633931 4802 scope.go:117] "RemoveContainer" containerID="e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.646945 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-combined-ca-bundle\") pod \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") "
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.646986 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn6cs\" (UniqueName: \"kubernetes.io/projected/d5130da2-132e-4a81-ace7-f1e9d4d790d4-kube-api-access-nn6cs\") pod \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") "
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.647065 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5130da2-132e-4a81-ace7-f1e9d4d790d4-logs\") pod \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") "
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.647141 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data-custom\") pod \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") "
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.647164 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5130da2-132e-4a81-ace7-f1e9d4d790d4-etc-machine-id\") pod \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") "
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.647833 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5130da2-132e-4a81-ace7-f1e9d4d790d4-logs" (OuterVolumeSpecName: "logs") pod "d5130da2-132e-4a81-ace7-f1e9d4d790d4" (UID: "d5130da2-132e-4a81-ace7-f1e9d4d790d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.647937 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data\") pod \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") "
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.648044 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-scripts\") pod \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\" (UID: \"d5130da2-132e-4a81-ace7-f1e9d4d790d4\") "
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.648608 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5130da2-132e-4a81-ace7-f1e9d4d790d4-logs\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.648709 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5130da2-132e-4a81-ace7-f1e9d4d790d4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d5130da2-132e-4a81-ace7-f1e9d4d790d4" (UID: "d5130da2-132e-4a81-ace7-f1e9d4d790d4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.652960 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-scripts" (OuterVolumeSpecName: "scripts") pod "d5130da2-132e-4a81-ace7-f1e9d4d790d4" (UID: "d5130da2-132e-4a81-ace7-f1e9d4d790d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.653993 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5130da2-132e-4a81-ace7-f1e9d4d790d4-kube-api-access-nn6cs" (OuterVolumeSpecName: "kube-api-access-nn6cs") pod "d5130da2-132e-4a81-ace7-f1e9d4d790d4" (UID: "d5130da2-132e-4a81-ace7-f1e9d4d790d4"). InnerVolumeSpecName "kube-api-access-nn6cs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.657297 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d5130da2-132e-4a81-ace7-f1e9d4d790d4" (UID: "d5130da2-132e-4a81-ace7-f1e9d4d790d4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.662374 4802 scope.go:117] "RemoveContainer" containerID="f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.699041 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5130da2-132e-4a81-ace7-f1e9d4d790d4" (UID: "d5130da2-132e-4a81-ace7-f1e9d4d790d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.743367 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data" (OuterVolumeSpecName: "config-data") pod "d5130da2-132e-4a81-ace7-f1e9d4d790d4" (UID: "d5130da2-132e-4a81-ace7-f1e9d4d790d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.751088 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.751138 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn6cs\" (UniqueName: \"kubernetes.io/projected/d5130da2-132e-4a81-ace7-f1e9d4d790d4-kube-api-access-nn6cs\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.751150 4802 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.751162 4802 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5130da2-132e-4a81-ace7-f1e9d4d790d4-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.751171 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.751205 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5130da2-132e-4a81-ace7-f1e9d4d790d4-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.755971 4802 scope.go:117] "RemoveContainer" containerID="e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334"
Dec 01 20:49:47 crc kubenswrapper[4802]: E1201 20:49:47.756895 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334\": container with ID starting with e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334 not found: ID does not exist" containerID="e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.756936 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334"} err="failed to get container status \"e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334\": rpc error: code = NotFound desc = could not find container \"e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334\": container with ID starting with e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334 not found: ID does not exist"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.756960 4802 scope.go:117] "RemoveContainer" containerID="f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18"
Dec 01 20:49:47 crc kubenswrapper[4802]: E1201 20:49:47.757459 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18\": container with ID starting with f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18 not found: ID does not exist" containerID="f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.757481 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18"} err="failed to get container status \"f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18\": rpc error: code = NotFound desc = could not find container \"f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18\": container with ID starting with f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18 not found: ID does not exist"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.757506 4802 scope.go:117] "RemoveContainer" containerID="e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.759259 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334"} err="failed to get container status \"e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334\": rpc error: code = NotFound desc = could not find container \"e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334\": container with ID starting with e8676abea58b0fb8692bdc1c71919889eab8f597bf2e49fbc280614927658334 not found: ID does not exist"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.759288 4802 scope.go:117] "RemoveContainer" containerID="f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.759669 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18"} err="failed to get container status \"f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18\": rpc error: code = NotFound desc = could not find container \"f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18\": container with ID starting with f27a5ac74821ea0c92bd4231f3b365d3cd4039c4e8cf4d2d309e980e49445a18 not found: ID does not exist"
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.977782 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Dec 01 20:49:47 crc kubenswrapper[4802]: I1201 20:49:47.991451 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"]
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.000930 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Dec 01 20:49:48 crc kubenswrapper[4802]: E1201 20:49:48.001556 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerName="manila-api-log"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.001569 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerName="manila-api-log"
Dec 01 20:49:48 crc kubenswrapper[4802]: E1201 20:49:48.001611 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerName="manila-api"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.001617 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerName="manila-api"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.001891 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerName="manila-api"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.001936 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" containerName="manila-api-log"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.004646 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.009748 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.010073 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.013351 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.025565 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.158774 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.158839 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-scripts\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.158872 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvshh\" (UniqueName: \"kubernetes.io/projected/cde0d25e-888c-44b9-95a0-3bdae318a8b0-kube-api-access-kvshh\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.159653 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cde0d25e-888c-44b9-95a0-3bdae318a8b0-etc-machine-id\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.159692 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-internal-tls-certs\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.159768 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cde0d25e-888c-44b9-95a0-3bdae318a8b0-logs\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.159824 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-public-tls-certs\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.159877 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-config-data\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.160088 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-config-data-custom\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.262594 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-config-data-custom\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.262747 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.262809 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-scripts\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.262842 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvshh\" (UniqueName: \"kubernetes.io/projected/cde0d25e-888c-44b9-95a0-3bdae318a8b0-kube-api-access-kvshh\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.262942 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cde0d25e-888c-44b9-95a0-3bdae318a8b0-etc-machine-id\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.262972 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-internal-tls-certs\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.263009 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cde0d25e-888c-44b9-95a0-3bdae318a8b0-logs\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.263047 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-public-tls-certs\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.263080 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-config-data\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.265376 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cde0d25e-888c-44b9-95a0-3bdae318a8b0-etc-machine-id\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.265679 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cde0d25e-888c-44b9-95a0-3bdae318a8b0-logs\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.276878 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-internal-tls-certs\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.276995 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-config-data-custom\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.278915 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.281502 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-scripts\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.287335 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-public-tls-certs\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.287818 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde0d25e-888c-44b9-95a0-3bdae318a8b0-config-data\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.289241 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvshh\" (UniqueName: \"kubernetes.io/projected/cde0d25e-888c-44b9-95a0-3bdae318a8b0-kube-api-access-kvshh\") pod \"manila-api-0\" (UID: \"cde0d25e-888c-44b9-95a0-3bdae318a8b0\") " pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.335581 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 01 20:49:48 crc kubenswrapper[4802]: I1201 20:49:48.746913 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5130da2-132e-4a81-ace7-f1e9d4d790d4" path="/var/lib/kubelet/pods/d5130da2-132e-4a81-ace7-f1e9d4d790d4/volumes"
Dec 01 20:49:49 crc kubenswrapper[4802]: I1201 20:49:49.041227 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Dec 01 20:49:49 crc kubenswrapper[4802]: I1201 20:49:49.672802 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"cde0d25e-888c-44b9-95a0-3bdae318a8b0","Type":"ContainerStarted","Data":"b37d102b242e16d3f1ccc6dffb3fe3a4aea226b3be6cb35403b2430567f535b7"}
Dec 01 20:49:49 crc kubenswrapper[4802]: I1201 20:49:49.673289 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"cde0d25e-888c-44b9-95a0-3bdae318a8b0","Type":"ContainerStarted","Data":"e58df6c0016c384c482a69ea178dbd0a132a361c991c71cbe353b559f81bcd66"}
Dec 01 20:49:51 crc kubenswrapper[4802]: I1201 20:49:51.720163 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098"
Dec 01 20:49:51 crc kubenswrapper[4802]: E1201 20:49:51.720665 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.112185 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.266389 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-jwp25"
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.365822 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-4klz2"]
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.366102 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" podUID="8dfcb60c-7f93-47a3-8343-2d99177de7f2" containerName="dnsmasq-dns" containerID="cri-o://3aadd1e1bb764aef360aa9ef7b8db58ba64fd44f8183f4f390db9c748c4948cd" gracePeriod=10
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.703722 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.712852 4802 generic.go:334] "Generic (PLEG): container finished" podID="8dfcb60c-7f93-47a3-8343-2d99177de7f2" containerID="3aadd1e1bb764aef360aa9ef7b8db58ba64fd44f8183f4f390db9c748c4948cd" exitCode=0
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.712911 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" event={"ID":"8dfcb60c-7f93-47a3-8343-2d99177de7f2","Type":"ContainerDied","Data":"3aadd1e1bb764aef360aa9ef7b8db58ba64fd44f8183f4f390db9c748c4948cd"}
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.747846 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.747826856 podStartE2EDuration="5.747826856s" podCreationTimestamp="2025-12-01 20:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:49:52.736463462 +0000 UTC m=+3214.299023103" watchObservedRunningTime="2025-12-01 20:49:52.747826856 +0000 UTC m=+3214.310386497"
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.946098 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2"
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.989941 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-494kb\" (UniqueName: \"kubernetes.io/projected/8dfcb60c-7f93-47a3-8343-2d99177de7f2-kube-api-access-494kb\") pod \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") "
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.990013 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-dns-svc\") pod \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") "
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.990066 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-sb\") pod \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") "
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.990127 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-openstack-edpm-ipam\") pod \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") "
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.990296 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-config\") pod \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") "
Dec 01 20:49:52 crc kubenswrapper[4802]: I1201 20:49:52.990402 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-nb\") pod \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\" (UID: \"8dfcb60c-7f93-47a3-8343-2d99177de7f2\") "
Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.079633 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfcb60c-7f93-47a3-8343-2d99177de7f2-kube-api-access-494kb" (OuterVolumeSpecName: "kube-api-access-494kb") pod "8dfcb60c-7f93-47a3-8343-2d99177de7f2" (UID: "8dfcb60c-7f93-47a3-8343-2d99177de7f2"). InnerVolumeSpecName "kube-api-access-494kb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.095416 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-494kb\" (UniqueName: \"kubernetes.io/projected/8dfcb60c-7f93-47a3-8343-2d99177de7f2-kube-api-access-494kb\") on node \"crc\" DevicePath \"\""
Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.108018 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8dfcb60c-7f93-47a3-8343-2d99177de7f2" (UID: "8dfcb60c-7f93-47a3-8343-2d99177de7f2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.115875 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8dfcb60c-7f93-47a3-8343-2d99177de7f2" (UID: "8dfcb60c-7f93-47a3-8343-2d99177de7f2"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.159975 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-config" (OuterVolumeSpecName: "config") pod "8dfcb60c-7f93-47a3-8343-2d99177de7f2" (UID: "8dfcb60c-7f93-47a3-8343-2d99177de7f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.160518 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8dfcb60c-7f93-47a3-8343-2d99177de7f2" (UID: "8dfcb60c-7f93-47a3-8343-2d99177de7f2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.160581 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8dfcb60c-7f93-47a3-8343-2d99177de7f2" (UID: "8dfcb60c-7f93-47a3-8343-2d99177de7f2"). InnerVolumeSpecName "dns-svc".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.199731 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.199767 4802 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.199779 4802 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.199788 4802 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.199797 4802 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfcb60c-7f93-47a3-8343-2d99177de7f2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.740221 4802 generic.go:334] "Generic (PLEG): container finished" podID="c071511c-6a08-4631-b449-92a8a12d69f4" containerID="1aa285c8b297734162708ff008d5e2c123d15db7271330d5cff13f5c9172276c" exitCode=137 Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.740471 4802 generic.go:334] "Generic (PLEG): container finished" podID="c071511c-6a08-4631-b449-92a8a12d69f4" containerID="28b740a183951dbc8b038cecbaf80096e654f5d43f4b08737d83e53fb5fb5012" exitCode=137 Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.740558 4802 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/horizon-6468fb5467-v7gmh" event={"ID":"c071511c-6a08-4631-b449-92a8a12d69f4","Type":"ContainerDied","Data":"1aa285c8b297734162708ff008d5e2c123d15db7271330d5cff13f5c9172276c"} Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.740585 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6468fb5467-v7gmh" event={"ID":"c071511c-6a08-4631-b449-92a8a12d69f4","Type":"ContainerDied","Data":"28b740a183951dbc8b038cecbaf80096e654f5d43f4b08737d83e53fb5fb5012"} Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.744292 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"cde0d25e-888c-44b9-95a0-3bdae318a8b0","Type":"ContainerStarted","Data":"90667d5cf2dfc1a002223ef598580728e75635a148c37d3a73949889bc5e6a14"} Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.751709 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" event={"ID":"8dfcb60c-7f93-47a3-8343-2d99177de7f2","Type":"ContainerDied","Data":"5f8d9a63eb925a6bf2579a16a99ddfaa527411c9a620cbd843574c54b2a486ba"} Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.751749 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-4klz2" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.751770 4802 scope.go:117] "RemoveContainer" containerID="3aadd1e1bb764aef360aa9ef7b8db58ba64fd44f8183f4f390db9c748c4948cd" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.771483 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1","Type":"ContainerStarted","Data":"fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e"} Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.774777 4802 generic.go:334] "Generic (PLEG): container finished" podID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerID="53ea3a255cc779eca997cff9d75eac85026808c121724c77d6caf3cd0cdf46d6" exitCode=137 Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.774810 4802 generic.go:334] "Generic (PLEG): container finished" podID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerID="6fc3e7343ae40f36a4591583dc681ac03d10d6e49fed88bf9049f12db037853a" exitCode=137 Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.774835 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769b6f4d57-mlfdl" event={"ID":"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9","Type":"ContainerDied","Data":"53ea3a255cc779eca997cff9d75eac85026808c121724c77d6caf3cd0cdf46d6"} Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.774890 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769b6f4d57-mlfdl" event={"ID":"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9","Type":"ContainerDied","Data":"6fc3e7343ae40f36a4591583dc681ac03d10d6e49fed88bf9049f12db037853a"} Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.792823 4802 scope.go:117] "RemoveContainer" containerID="81f3bc7904b921b1be4b2cb626fe9c00b75fda076bd77760295e66a8837e56d5" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.798293 4802 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.922248391 podStartE2EDuration="12.798266876s" podCreationTimestamp="2025-12-01 20:49:41 +0000 UTC" firstStartedPulling="2025-12-01 20:49:43.147586528 +0000 UTC m=+3204.710146169" lastFinishedPulling="2025-12-01 20:49:52.023605023 +0000 UTC m=+3213.586164654" observedRunningTime="2025-12-01 20:49:53.79328287 +0000 UTC m=+3215.355842531" watchObservedRunningTime="2025-12-01 20:49:53.798266876 +0000 UTC m=+3215.360826517" Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.821714 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-4klz2"] Dec 01 20:49:53 crc kubenswrapper[4802]: I1201 20:49:53.827533 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-4klz2"] Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.060371 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.200414 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.217906 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-scripts\") pod \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.218044 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-logs\") pod \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.218073 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-config-data\") pod \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.218243 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq2qf\" (UniqueName: \"kubernetes.io/projected/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-kube-api-access-qq2qf\") pod \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.218280 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-horizon-secret-key\") pod \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\" (UID: \"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.219128 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-logs" (OuterVolumeSpecName: "logs") pod "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" (UID: "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.225185 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" (UID: "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.226017 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-kube-api-access-qq2qf" (OuterVolumeSpecName: "kube-api-access-qq2qf") pod "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" (UID: "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9"). InnerVolumeSpecName "kube-api-access-qq2qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.269434 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-config-data" (OuterVolumeSpecName: "config-data") pod "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" (UID: "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.309886 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-scripts" (OuterVolumeSpecName: "scripts") pod "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" (UID: "1fdc24d0-b560-47a5-ad7d-d77d7581f7d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.319784 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-scripts\") pod \"c071511c-6a08-4631-b449-92a8a12d69f4\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.319870 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c071511c-6a08-4631-b449-92a8a12d69f4-horizon-secret-key\") pod \"c071511c-6a08-4631-b449-92a8a12d69f4\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.319975 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-config-data\") pod \"c071511c-6a08-4631-b449-92a8a12d69f4\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.320047 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxqkd\" (UniqueName: \"kubernetes.io/projected/c071511c-6a08-4631-b449-92a8a12d69f4-kube-api-access-xxqkd\") pod \"c071511c-6a08-4631-b449-92a8a12d69f4\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.320102 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c071511c-6a08-4631-b449-92a8a12d69f4-logs\") pod \"c071511c-6a08-4631-b449-92a8a12d69f4\" (UID: \"c071511c-6a08-4631-b449-92a8a12d69f4\") " Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.320702 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.320730 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.320744 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq2qf\" (UniqueName: \"kubernetes.io/projected/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-kube-api-access-qq2qf\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.320757 4802 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.320769 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.321441 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c071511c-6a08-4631-b449-92a8a12d69f4-logs" (OuterVolumeSpecName: "logs") pod "c071511c-6a08-4631-b449-92a8a12d69f4" (UID: "c071511c-6a08-4631-b449-92a8a12d69f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.326073 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c071511c-6a08-4631-b449-92a8a12d69f4-kube-api-access-xxqkd" (OuterVolumeSpecName: "kube-api-access-xxqkd") pod "c071511c-6a08-4631-b449-92a8a12d69f4" (UID: "c071511c-6a08-4631-b449-92a8a12d69f4"). InnerVolumeSpecName "kube-api-access-xxqkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.334552 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c071511c-6a08-4631-b449-92a8a12d69f4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c071511c-6a08-4631-b449-92a8a12d69f4" (UID: "c071511c-6a08-4631-b449-92a8a12d69f4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.354572 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-scripts" (OuterVolumeSpecName: "scripts") pod "c071511c-6a08-4631-b449-92a8a12d69f4" (UID: "c071511c-6a08-4631-b449-92a8a12d69f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.364764 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-config-data" (OuterVolumeSpecName: "config-data") pod "c071511c-6a08-4631-b449-92a8a12d69f4" (UID: "c071511c-6a08-4631-b449-92a8a12d69f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.422353 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxqkd\" (UniqueName: \"kubernetes.io/projected/c071511c-6a08-4631-b449-92a8a12d69f4-kube-api-access-xxqkd\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.422394 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c071511c-6a08-4631-b449-92a8a12d69f4-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.422408 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.422432 4802 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c071511c-6a08-4631-b449-92a8a12d69f4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.422446 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c071511c-6a08-4631-b449-92a8a12d69f4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.692249 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.704183 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b69f75cb8-xrkks" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.734448 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfcb60c-7f93-47a3-8343-2d99177de7f2" path="/var/lib/kubelet/pods/8dfcb60c-7f93-47a3-8343-2d99177de7f2/volumes" Dec 01 20:49:54 crc 
kubenswrapper[4802]: I1201 20:49:54.848692 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6468fb5467-v7gmh" event={"ID":"c071511c-6a08-4631-b449-92a8a12d69f4","Type":"ContainerDied","Data":"f0f30a383466134d7929b234e3c0de8c53db85ccee6a12099b3db3e603a63f73"} Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.848746 4802 scope.go:117] "RemoveContainer" containerID="1aa285c8b297734162708ff008d5e2c123d15db7271330d5cff13f5c9172276c" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.848896 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6468fb5467-v7gmh" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.879419 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6468fb5467-v7gmh"] Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.897963 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6468fb5467-v7gmh"] Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.899326 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1","Type":"ContainerStarted","Data":"ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826"} Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.922630 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-769b6f4d57-mlfdl" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.923169 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-769b6f4d57-mlfdl" event={"ID":"1fdc24d0-b560-47a5-ad7d-d77d7581f7d9","Type":"ContainerDied","Data":"1d75c112d515826420f4abe6d5deb3ce3ba43b2c24e157c2a7287fd547f819e4"} Dec 01 20:49:54 crc kubenswrapper[4802]: E1201 20:49:54.924079 4802 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc071511c_6a08_4631_b449_92a8a12d69f4.slice/crio-f0f30a383466134d7929b234e3c0de8c53db85ccee6a12099b3db3e603a63f73\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fdc24d0_b560_47a5_ad7d_d77d7581f7d9.slice/crio-1d75c112d515826420f4abe6d5deb3ce3ba43b2c24e157c2a7287fd547f819e4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fdc24d0_b560_47a5_ad7d_d77d7581f7d9.slice\": RecentStats: unable to find data in memory cache]" Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.972979 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-769b6f4d57-mlfdl"] Dec 01 20:49:54 crc kubenswrapper[4802]: I1201 20:49:54.991978 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-769b6f4d57-mlfdl"] Dec 01 20:49:55 crc kubenswrapper[4802]: I1201 20:49:55.043835 4802 scope.go:117] "RemoveContainer" containerID="28b740a183951dbc8b038cecbaf80096e654f5d43f4b08737d83e53fb5fb5012" Dec 01 20:49:55 crc kubenswrapper[4802]: I1201 20:49:55.091279 4802 scope.go:117] "RemoveContainer" containerID="53ea3a255cc779eca997cff9d75eac85026808c121724c77d6caf3cd0cdf46d6" Dec 01 20:49:55 crc kubenswrapper[4802]: I1201 20:49:55.262598 4802 scope.go:117] "RemoveContainer" 
containerID="6fc3e7343ae40f36a4591583dc681ac03d10d6e49fed88bf9049f12db037853a" Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.066490 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.067140 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="ceilometer-central-agent" containerID="cri-o://eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d" gracePeriod=30 Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.067177 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="ceilometer-notification-agent" containerID="cri-o://5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb" gracePeriod=30 Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.067222 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="sg-core" containerID="cri-o://5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869" gracePeriod=30 Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.067177 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="proxy-httpd" containerID="cri-o://9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f" gracePeriod=30 Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.735212 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" path="/var/lib/kubelet/pods/1fdc24d0-b560-47a5-ad7d-d77d7581f7d9/volumes" Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.736107 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="c071511c-6a08-4631-b449-92a8a12d69f4" path="/var/lib/kubelet/pods/c071511c-6a08-4631-b449-92a8a12d69f4/volumes" Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.953093 4802 generic.go:334] "Generic (PLEG): container finished" podID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerID="9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f" exitCode=0 Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.953150 4802 generic.go:334] "Generic (PLEG): container finished" podID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerID="5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869" exitCode=2 Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.953163 4802 generic.go:334] "Generic (PLEG): container finished" podID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerID="eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d" exitCode=0 Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.953158 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerDied","Data":"9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f"} Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.953225 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerDied","Data":"5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869"} Dec 01 20:49:56 crc kubenswrapper[4802]: I1201 20:49:56.953237 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerDied","Data":"eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d"} Dec 01 20:49:57 crc kubenswrapper[4802]: I1201 20:49:57.268787 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:49:57 crc 
kubenswrapper[4802]: I1201 20:49:57.273908 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b69f75cb8-xrkks"
Dec 01 20:49:57 crc kubenswrapper[4802]: I1201 20:49:57.409910 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58f6ccd776-gr6kp"]
Dec 01 20:49:57 crc kubenswrapper[4802]: I1201 20:49:57.961339 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58f6ccd776-gr6kp" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon-log" containerID="cri-o://de897dcdcc75080e2b98ab343ee6181eade9fc23a8da10a7e641218df7f93229" gracePeriod=30
Dec 01 20:49:57 crc kubenswrapper[4802]: I1201 20:49:57.961435 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58f6ccd776-gr6kp" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon" containerID="cri-o://232156007ccc94a24dde9aeeb757f17c9242d6806240397fa3b536de5424c9e1" gracePeriod=30
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.502758 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ftfsb"]
Dec 01 20:50:00 crc kubenswrapper[4802]: E1201 20:50:00.504132 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c071511c-6a08-4631-b449-92a8a12d69f4" containerName="horizon"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504150 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c071511c-6a08-4631-b449-92a8a12d69f4" containerName="horizon"
Dec 01 20:50:00 crc kubenswrapper[4802]: E1201 20:50:00.504165 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfcb60c-7f93-47a3-8343-2d99177de7f2" containerName="init"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504171 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfcb60c-7f93-47a3-8343-2d99177de7f2" containerName="init"
Dec 01 20:50:00 crc kubenswrapper[4802]: E1201 20:50:00.504210 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfcb60c-7f93-47a3-8343-2d99177de7f2" containerName="dnsmasq-dns"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504216 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfcb60c-7f93-47a3-8343-2d99177de7f2" containerName="dnsmasq-dns"
Dec 01 20:50:00 crc kubenswrapper[4802]: E1201 20:50:00.504231 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerName="horizon-log"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504236 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerName="horizon-log"
Dec 01 20:50:00 crc kubenswrapper[4802]: E1201 20:50:00.504252 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerName="horizon"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504258 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerName="horizon"
Dec 01 20:50:00 crc kubenswrapper[4802]: E1201 20:50:00.504271 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c071511c-6a08-4631-b449-92a8a12d69f4" containerName="horizon-log"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504283 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c071511c-6a08-4631-b449-92a8a12d69f4" containerName="horizon-log"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504514 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="c071511c-6a08-4631-b449-92a8a12d69f4" containerName="horizon"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504539 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="c071511c-6a08-4631-b449-92a8a12d69f4" containerName="horizon-log"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504552 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfcb60c-7f93-47a3-8343-2d99177de7f2" containerName="dnsmasq-dns"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504570 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerName="horizon-log"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.504579 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fdc24d0-b560-47a5-ad7d-d77d7581f7d9" containerName="horizon"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.506377 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.518317 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftfsb"]
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.676138 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.696885 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whnvl\" (UniqueName: \"kubernetes.io/projected/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-kube-api-access-whnvl\") pod \"redhat-marketplace-ftfsb\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") " pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.696947 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-catalog-content\") pod \"redhat-marketplace-ftfsb\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") " pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.697112 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-utilities\") pod \"redhat-marketplace-ftfsb\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") " pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.798606 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-scripts\") pod \"8d1dda46-a58e-449a-b955-fe29d42e657b\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") "
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799050 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-config-data\") pod \"8d1dda46-a58e-449a-b955-fe29d42e657b\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") "
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799081 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-ceilometer-tls-certs\") pod \"8d1dda46-a58e-449a-b955-fe29d42e657b\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") "
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799112 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7dcv\" (UniqueName: \"kubernetes.io/projected/8d1dda46-a58e-449a-b955-fe29d42e657b-kube-api-access-j7dcv\") pod \"8d1dda46-a58e-449a-b955-fe29d42e657b\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") "
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799150 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-sg-core-conf-yaml\") pod \"8d1dda46-a58e-449a-b955-fe29d42e657b\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") "
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799227 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-run-httpd\") pod \"8d1dda46-a58e-449a-b955-fe29d42e657b\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") "
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799291 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-log-httpd\") pod \"8d1dda46-a58e-449a-b955-fe29d42e657b\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") "
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799362 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-combined-ca-bundle\") pod \"8d1dda46-a58e-449a-b955-fe29d42e657b\" (UID: \"8d1dda46-a58e-449a-b955-fe29d42e657b\") "
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799868 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whnvl\" (UniqueName: \"kubernetes.io/projected/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-kube-api-access-whnvl\") pod \"redhat-marketplace-ftfsb\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") " pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799908 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-catalog-content\") pod \"redhat-marketplace-ftfsb\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") " pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.799996 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-utilities\") pod \"redhat-marketplace-ftfsb\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") " pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.802608 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8d1dda46-a58e-449a-b955-fe29d42e657b" (UID: "8d1dda46-a58e-449a-b955-fe29d42e657b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.803067 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8d1dda46-a58e-449a-b955-fe29d42e657b" (UID: "8d1dda46-a58e-449a-b955-fe29d42e657b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.806304 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-scripts" (OuterVolumeSpecName: "scripts") pod "8d1dda46-a58e-449a-b955-fe29d42e657b" (UID: "8d1dda46-a58e-449a-b955-fe29d42e657b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.806330 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1dda46-a58e-449a-b955-fe29d42e657b-kube-api-access-j7dcv" (OuterVolumeSpecName: "kube-api-access-j7dcv") pod "8d1dda46-a58e-449a-b955-fe29d42e657b" (UID: "8d1dda46-a58e-449a-b955-fe29d42e657b"). InnerVolumeSpecName "kube-api-access-j7dcv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.812807 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-utilities\") pod \"redhat-marketplace-ftfsb\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") " pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.812894 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-catalog-content\") pod \"redhat-marketplace-ftfsb\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") " pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.831090 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whnvl\" (UniqueName: \"kubernetes.io/projected/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-kube-api-access-whnvl\") pod \"redhat-marketplace-ftfsb\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") " pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.850576 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.869745 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8d1dda46-a58e-449a-b955-fe29d42e657b" (UID: "8d1dda46-a58e-449a-b955-fe29d42e657b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.907911 4802 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.907944 4802 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d1dda46-a58e-449a-b955-fe29d42e657b-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.907953 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.907962 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7dcv\" (UniqueName: \"kubernetes.io/projected/8d1dda46-a58e-449a-b955-fe29d42e657b-kube-api-access-j7dcv\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.907994 4802 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.938498 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8d1dda46-a58e-449a-b955-fe29d42e657b" (UID: "8d1dda46-a58e-449a-b955-fe29d42e657b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:50:00 crc kubenswrapper[4802]: I1201 20:50:00.956750 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d1dda46-a58e-449a-b955-fe29d42e657b" (UID: "8d1dda46-a58e-449a-b955-fe29d42e657b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.009674 4802 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.009698 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.011016 4802 generic.go:334] "Generic (PLEG): container finished" podID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerID="5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb" exitCode=0
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.011061 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerDied","Data":"5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb"}
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.011094 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d1dda46-a58e-449a-b955-fe29d42e657b","Type":"ContainerDied","Data":"9514e36a9f561c1a6ad518e1c5a59f6ced9540c3e26c2c6e299f2a83b0aa05be"}
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.011116 4802 scope.go:117] "RemoveContainer" containerID="9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.011325 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.045546 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-config-data" (OuterVolumeSpecName: "config-data") pod "8d1dda46-a58e-449a-b955-fe29d42e657b" (UID: "8d1dda46-a58e-449a-b955-fe29d42e657b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.080218 4802 scope.go:117] "RemoveContainer" containerID="5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.105908 4802 scope.go:117] "RemoveContainer" containerID="5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.111719 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1dda46-a58e-449a-b955-fe29d42e657b-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.147505 4802 scope.go:117] "RemoveContainer" containerID="eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.201899 4802 scope.go:117] "RemoveContainer" containerID="9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f"
Dec 01 20:50:01 crc kubenswrapper[4802]: E1201 20:50:01.208498 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f\": container with ID starting with 9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f not found: ID does not exist" containerID="9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.208559 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f"} err="failed to get container status \"9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f\": rpc error: code = NotFound desc = could not find container \"9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f\": container with ID starting with 9706577de228b94ec0ad1901203cbd57c1729e2c9fbedfa85764d8e1d6bbaa8f not found: ID does not exist"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.208600 4802 scope.go:117] "RemoveContainer" containerID="5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869"
Dec 01 20:50:01 crc kubenswrapper[4802]: E1201 20:50:01.209855 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869\": container with ID starting with 5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869 not found: ID does not exist" containerID="5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.209914 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869"} err="failed to get container status \"5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869\": rpc error: code = NotFound desc = could not find container \"5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869\": container with ID starting with 5a83d2ae33500dd1c207401a9de44606eb613a1bc6b09d691bd98fe298250869 not found: ID does not exist"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.209934 4802 scope.go:117] "RemoveContainer" containerID="5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb"
Dec 01 20:50:01 crc kubenswrapper[4802]: E1201 20:50:01.210448 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb\": container with ID starting with 5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb not found: ID does not exist" containerID="5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.210480 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb"} err="failed to get container status \"5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb\": rpc error: code = NotFound desc = could not find container \"5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb\": container with ID starting with 5d16d1356d437d9b54877a0ee0580037560b09884fa603923a333f8f50593ebb not found: ID does not exist"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.210498 4802 scope.go:117] "RemoveContainer" containerID="eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d"
Dec 01 20:50:01 crc kubenswrapper[4802]: E1201 20:50:01.210973 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d\": container with ID starting with eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d not found: ID does not exist" containerID="eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.211008 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d"} err="failed to get container status \"eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d\": rpc error: code = NotFound desc = could not find container \"eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d\": container with ID starting with eda06ba1783153159cc3aef59e91124ea4a804d55ded207da2e288949bc96a3d not found: ID does not exist"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.374801 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.382723 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.392737 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 01 20:50:01 crc kubenswrapper[4802]: E1201 20:50:01.393630 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="ceilometer-notification-agent"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.393669 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="ceilometer-notification-agent"
Dec 01 20:50:01 crc kubenswrapper[4802]: E1201 20:50:01.393728 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="ceilometer-central-agent"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.393746 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="ceilometer-central-agent"
Dec 01 20:50:01 crc kubenswrapper[4802]: E1201 20:50:01.393781 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="proxy-httpd"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.393800 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="proxy-httpd"
Dec 01 20:50:01 crc kubenswrapper[4802]: E1201 20:50:01.393822 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="sg-core"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.393838 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="sg-core"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.394394 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="ceilometer-central-agent"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.394455 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="ceilometer-notification-agent"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.394484 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="sg-core"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.394533 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" containerName="proxy-httpd"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.398083 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.400251 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.400575 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.400884 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.400903 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.447747 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftfsb"]
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.524100 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.524450 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjgqw\" (UniqueName: \"kubernetes.io/projected/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-kube-api-access-bjgqw\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.524487 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-config-data\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.524517 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.524585 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-run-httpd\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.524628 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-scripts\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.524656 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.524688 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-log-httpd\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.627810 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-scripts\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.627887 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.627947 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-log-httpd\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.627992 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.628065 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjgqw\" (UniqueName: \"kubernetes.io/projected/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-kube-api-access-bjgqw\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.628104 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-config-data\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.628131 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.628210 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-run-httpd\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.629015 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-run-httpd\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.629702 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-log-httpd\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.635958 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.638937 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-58f6ccd776-gr6kp" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.639151 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.641047 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-config-data\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.643658 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.644429 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-scripts\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.648148 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjgqw\" (UniqueName: \"kubernetes.io/projected/bdefd2ec-84d7-4e92-adeb-969ad52e35b6-kube-api-access-bjgqw\") pod \"ceilometer-0\" (UID: \"bdefd2ec-84d7-4e92-adeb-969ad52e35b6\") " pod="openstack/ceilometer-0"
Dec 01 20:50:01 crc kubenswrapper[4802]: I1201 20:50:01.716850 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 20:50:02 crc kubenswrapper[4802]: I1201 20:50:02.027937 4802 generic.go:334] "Generic (PLEG): container finished" podID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerID="54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06" exitCode=0
Dec 01 20:50:02 crc kubenswrapper[4802]: I1201 20:50:02.027988 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftfsb" event={"ID":"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405","Type":"ContainerDied","Data":"54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06"}
Dec 01 20:50:02 crc kubenswrapper[4802]: I1201 20:50:02.028626 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftfsb" event={"ID":"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405","Type":"ContainerStarted","Data":"1d731a346ee2328a426d3617e01d990f14d4056b9b276e101572bfa81f400693"}
Dec 01 20:50:02 crc kubenswrapper[4802]: I1201 20:50:02.036626 4802 generic.go:334] "Generic (PLEG): container finished" podID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerID="232156007ccc94a24dde9aeeb757f17c9242d6806240397fa3b536de5424c9e1" exitCode=0
Dec 01 20:50:02 crc kubenswrapper[4802]: I1201 20:50:02.036723 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58f6ccd776-gr6kp" event={"ID":"f2a61354-70f8-4e95-ab30-6e1d90128879","Type":"ContainerDied","Data":"232156007ccc94a24dde9aeeb757f17c9242d6806240397fa3b536de5424c9e1"}
Dec 01 20:50:02 crc kubenswrapper[4802]: I1201 20:50:02.130344 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Dec 01 20:50:02 crc kubenswrapper[4802]: I1201 20:50:02.155074 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 20:50:02 crc kubenswrapper[4802]: W1201 20:50:02.159473 
4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdefd2ec_84d7_4e92_adeb_969ad52e35b6.slice/crio-d413f76fabba3a6ee5aa8ff6e0c27026540ccb8064f04922b6eff6dbacf14743 WatchSource:0}: Error finding container d413f76fabba3a6ee5aa8ff6e0c27026540ccb8064f04922b6eff6dbacf14743: Status 404 returned error can't find the container with id d413f76fabba3a6ee5aa8ff6e0c27026540ccb8064f04922b6eff6dbacf14743 Dec 01 20:50:02 crc kubenswrapper[4802]: I1201 20:50:02.734140 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1dda46-a58e-449a-b955-fe29d42e657b" path="/var/lib/kubelet/pods/8d1dda46-a58e-449a-b955-fe29d42e657b/volumes" Dec 01 20:50:03 crc kubenswrapper[4802]: I1201 20:50:03.076780 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdefd2ec-84d7-4e92-adeb-969ad52e35b6","Type":"ContainerStarted","Data":"d413f76fabba3a6ee5aa8ff6e0c27026540ccb8064f04922b6eff6dbacf14743"} Dec 01 20:50:03 crc kubenswrapper[4802]: I1201 20:50:03.720299 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:50:04 crc kubenswrapper[4802]: I1201 20:50:04.094185 4802 generic.go:334] "Generic (PLEG): container finished" podID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerID="f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce" exitCode=0 Dec 01 20:50:04 crc kubenswrapper[4802]: I1201 20:50:04.094386 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftfsb" event={"ID":"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405","Type":"ContainerDied","Data":"f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce"} Dec 01 20:50:04 crc kubenswrapper[4802]: I1201 20:50:04.101421 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" 
event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"f8b3c5182085db2dc50eb18e66872b4d95d4989747b298b4b3a4f8c464087706"} Dec 01 20:50:04 crc kubenswrapper[4802]: I1201 20:50:04.103750 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdefd2ec-84d7-4e92-adeb-969ad52e35b6","Type":"ContainerStarted","Data":"a07d0990b8b73ba795dca7e3906dfc8e8b6ef8b9610ceefdff2512c4f2742ace"} Dec 01 20:50:04 crc kubenswrapper[4802]: I1201 20:50:04.103879 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdefd2ec-84d7-4e92-adeb-969ad52e35b6","Type":"ContainerStarted","Data":"3426335adeb622cd7e4b190f57bec3430d5fc0126a0fa9545c6f9f2b0f9b2d31"} Dec 01 20:50:04 crc kubenswrapper[4802]: I1201 20:50:04.173859 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 01 20:50:04 crc kubenswrapper[4802]: I1201 20:50:04.223761 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 20:50:04 crc kubenswrapper[4802]: I1201 20:50:04.342942 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 01 20:50:04 crc kubenswrapper[4802]: I1201 20:50:04.414242 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 20:50:05 crc kubenswrapper[4802]: I1201 20:50:05.125754 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdefd2ec-84d7-4e92-adeb-969ad52e35b6","Type":"ContainerStarted","Data":"eca01782098a8f81652e6ddbcb39ea0d3d77fdb8a795b7804ce859f919d649ad"} Dec 01 20:50:05 crc kubenswrapper[4802]: I1201 20:50:05.132737 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerName="manila-scheduler" 
containerID="cri-o://9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154" gracePeriod=30 Dec 01 20:50:05 crc kubenswrapper[4802]: I1201 20:50:05.133156 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerName="probe" containerID="cri-o://92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9" gracePeriod=30 Dec 01 20:50:05 crc kubenswrapper[4802]: I1201 20:50:05.133341 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftfsb" event={"ID":"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405","Type":"ContainerStarted","Data":"96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a"} Dec 01 20:50:05 crc kubenswrapper[4802]: I1201 20:50:05.134631 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" containerName="manila-share" containerID="cri-o://fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e" gracePeriod=30 Dec 01 20:50:05 crc kubenswrapper[4802]: I1201 20:50:05.134717 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" containerName="probe" containerID="cri-o://ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826" gracePeriod=30 Dec 01 20:50:05 crc kubenswrapper[4802]: I1201 20:50:05.174834 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ftfsb" podStartSLOduration=2.680814766 podStartE2EDuration="5.174814028s" podCreationTimestamp="2025-12-01 20:50:00 +0000 UTC" firstStartedPulling="2025-12-01 20:50:02.03029536 +0000 UTC m=+3223.592855001" lastFinishedPulling="2025-12-01 20:50:04.524294622 +0000 UTC m=+3226.086854263" observedRunningTime="2025-12-01 20:50:05.170339118 +0000 UTC 
m=+3226.732898759" watchObservedRunningTime="2025-12-01 20:50:05.174814028 +0000 UTC m=+3226.737373669" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.133662 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.142516 4802 generic.go:334] "Generic (PLEG): container finished" podID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerID="92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9" exitCode=0 Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.142583 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4c800d9a-2aa1-49df-a30b-23f36117381b","Type":"ContainerDied","Data":"92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9"} Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.145021 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.145094 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1","Type":"ContainerDied","Data":"ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826"} Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.145189 4802 scope.go:117] "RemoveContainer" containerID="ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.145622 4802 generic.go:334] "Generic (PLEG): container finished" podID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" containerID="ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826" exitCode=0 Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.145648 4802 generic.go:334] "Generic (PLEG): container finished" podID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" 
containerID="fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e" exitCode=1 Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.145709 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1","Type":"ContainerDied","Data":"fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e"} Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.145732 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1","Type":"ContainerDied","Data":"98e98be1dc1a8f5628b6d377e4a7ea156b46672f1c5b38e99093ff3dff89a6f8"} Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.175313 4802 scope.go:117] "RemoveContainer" containerID="fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.197822 4802 scope.go:117] "RemoveContainer" containerID="ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826" Dec 01 20:50:06 crc kubenswrapper[4802]: E1201 20:50:06.198382 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826\": container with ID starting with ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826 not found: ID does not exist" containerID="ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.198436 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826"} err="failed to get container status \"ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826\": rpc error: code = NotFound desc = could not find container \"ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826\": container 
with ID starting with ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826 not found: ID does not exist" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.198470 4802 scope.go:117] "RemoveContainer" containerID="fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e" Dec 01 20:50:06 crc kubenswrapper[4802]: E1201 20:50:06.198923 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e\": container with ID starting with fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e not found: ID does not exist" containerID="fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.198970 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e"} err="failed to get container status \"fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e\": rpc error: code = NotFound desc = could not find container \"fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e\": container with ID starting with fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e not found: ID does not exist" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.198991 4802 scope.go:117] "RemoveContainer" containerID="ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.199426 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826"} err="failed to get container status \"ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826\": rpc error: code = NotFound desc = could not find container \"ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826\": 
container with ID starting with ab8d620faebe0ba44e6ebbb064f0ba00e5ac3afd7b2c87b43eeba8403982f826 not found: ID does not exist" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.199446 4802 scope.go:117] "RemoveContainer" containerID="fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.199753 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e"} err="failed to get container status \"fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e\": rpc error: code = NotFound desc = could not find container \"fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e\": container with ID starting with fb556dfffa44ff8a2ea9fcb1972e6d1fee1ad4d932115325b64d1f082a1db77e not found: ID does not exist" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.230670 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-ceph\") pod \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.231048 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-etc-machine-id\") pod \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.231164 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-combined-ca-bundle\") pod \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 
20:50:06.231250 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data\") pod \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.231268 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-var-lib-manila\") pod \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.231342 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-scripts\") pod \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.231398 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data-custom\") pod \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.231460 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfwjl\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-kube-api-access-tfwjl\") pod \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\" (UID: \"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1\") " Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.233166 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" (UID: "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.233747 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" (UID: "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.238966 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" (UID: "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.239003 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-kube-api-access-tfwjl" (OuterVolumeSpecName: "kube-api-access-tfwjl") pod "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" (UID: "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1"). InnerVolumeSpecName "kube-api-access-tfwjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.238962 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-ceph" (OuterVolumeSpecName: "ceph") pod "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" (UID: "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.239599 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-scripts" (OuterVolumeSpecName: "scripts") pod "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" (UID: "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.318083 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" (UID: "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.333841 4802 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.333878 4802 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.333902 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.333912 4802 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-var-lib-manila\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:06 crc 
kubenswrapper[4802]: I1201 20:50:06.333921 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.333929 4802 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.333940 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfwjl\" (UniqueName: \"kubernetes.io/projected/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-kube-api-access-tfwjl\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.346388 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data" (OuterVolumeSpecName: "config-data") pod "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" (UID: "8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.436330 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.498805 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.521590 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.544589 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 20:50:06 crc kubenswrapper[4802]: E1201 20:50:06.545090 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" containerName="probe" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.545105 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" containerName="probe" Dec 01 20:50:06 crc kubenswrapper[4802]: E1201 20:50:06.545143 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" containerName="manila-share" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.545151 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" containerName="manila-share" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.545393 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" containerName="manila-share" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.545419 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" containerName="probe" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 
20:50:06.546634 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.556082 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.573813 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.639920 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-config-data\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.640244 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pzml\" (UniqueName: \"kubernetes.io/projected/54773416-92b4-406d-b8f1-c78331faa64e-kube-api-access-5pzml\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.640285 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/54773416-92b4-406d-b8f1-c78331faa64e-ceph\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.640319 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " 
pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.640411 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54773416-92b4-406d-b8f1-c78331faa64e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.640448 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/54773416-92b4-406d-b8f1-c78331faa64e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.640490 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-scripts\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.640516 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0" Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.742212 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pzml\" (UniqueName: \"kubernetes.io/projected/54773416-92b4-406d-b8f1-c78331faa64e-kube-api-access-5pzml\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0" Dec 01 
20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.742281 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/54773416-92b4-406d-b8f1-c78331faa64e-ceph\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.742310 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.742382 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54773416-92b4-406d-b8f1-c78331faa64e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.742409 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/54773416-92b4-406d-b8f1-c78331faa64e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.742442 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-scripts\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.742464 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.742572 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-config-data\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.742888 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54773416-92b4-406d-b8f1-c78331faa64e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.743369 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/54773416-92b4-406d-b8f1-c78331faa64e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.754929 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/54773416-92b4-406d-b8f1-c78331faa64e-ceph\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.755493 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-scripts\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.758727 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.764755 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-config-data\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.779969 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pzml\" (UniqueName: \"kubernetes.io/projected/54773416-92b4-406d-b8f1-c78331faa64e-kube-api-access-5pzml\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.783806 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54773416-92b4-406d-b8f1-c78331faa64e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"54773416-92b4-406d-b8f1-c78331faa64e\") " pod="openstack/manila-share-share1-0"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.789354 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1" path="/var/lib/kubelet/pods/8d9a97f6-0863-4e4d-a7b5-12bd7ce866b1/volumes"
Dec 01 20:50:06 crc kubenswrapper[4802]: I1201 20:50:06.886370 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Dec 01 20:50:07 crc kubenswrapper[4802]: I1201 20:50:07.162159 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdefd2ec-84d7-4e92-adeb-969ad52e35b6","Type":"ContainerStarted","Data":"a8241ab2b469cee4ad4302bcf60f9ada27d59eb44189190eb8c183011bf908c3"}
Dec 01 20:50:07 crc kubenswrapper[4802]: I1201 20:50:07.162599 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 01 20:50:07 crc kubenswrapper[4802]: I1201 20:50:07.186795 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.978337418 podStartE2EDuration="6.186776556s" podCreationTimestamp="2025-12-01 20:50:01 +0000 UTC" firstStartedPulling="2025-12-01 20:50:02.161461078 +0000 UTC m=+3223.724020719" lastFinishedPulling="2025-12-01 20:50:06.369900216 +0000 UTC m=+3227.932459857" observedRunningTime="2025-12-01 20:50:07.18178978 +0000 UTC m=+3228.744349431" watchObservedRunningTime="2025-12-01 20:50:07.186776556 +0000 UTC m=+3228.749336217"
Dec 01 20:50:08 crc kubenswrapper[4802]: I1201 20:50:08.179630 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Dec 01 20:50:09 crc kubenswrapper[4802]: I1201 20:50:09.186628 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"54773416-92b4-406d-b8f1-c78331faa64e","Type":"ContainerStarted","Data":"a64eb82759e6c8ae91b0458323ed2993eb1d4c3c356476a8528c16480093d1ea"}
Dec 01 20:50:09 crc kubenswrapper[4802]: I1201 20:50:09.187037 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"54773416-92b4-406d-b8f1-c78331faa64e","Type":"ContainerStarted","Data":"19055b65a9e733e492250fc43e804a5977d9e9c1ab36ae198b16ed0e590063e0"}
Dec 01 20:50:09 crc kubenswrapper[4802]: I1201 20:50:09.995724 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.054180 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.119729 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c800d9a-2aa1-49df-a30b-23f36117381b-etc-machine-id\") pod \"4c800d9a-2aa1-49df-a30b-23f36117381b\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") "
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.119834 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-scripts\") pod \"4c800d9a-2aa1-49df-a30b-23f36117381b\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") "
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.119918 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rknr\" (UniqueName: \"kubernetes.io/projected/4c800d9a-2aa1-49df-a30b-23f36117381b-kube-api-access-4rknr\") pod \"4c800d9a-2aa1-49df-a30b-23f36117381b\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") "
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.119950 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data-custom\") pod \"4c800d9a-2aa1-49df-a30b-23f36117381b\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") "
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.119971 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-combined-ca-bundle\") pod \"4c800d9a-2aa1-49df-a30b-23f36117381b\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") "
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.120023 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data\") pod \"4c800d9a-2aa1-49df-a30b-23f36117381b\" (UID: \"4c800d9a-2aa1-49df-a30b-23f36117381b\") "
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.120940 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c800d9a-2aa1-49df-a30b-23f36117381b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4c800d9a-2aa1-49df-a30b-23f36117381b" (UID: "4c800d9a-2aa1-49df-a30b-23f36117381b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.130360 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4c800d9a-2aa1-49df-a30b-23f36117381b" (UID: "4c800d9a-2aa1-49df-a30b-23f36117381b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.130381 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-scripts" (OuterVolumeSpecName: "scripts") pod "4c800d9a-2aa1-49df-a30b-23f36117381b" (UID: "4c800d9a-2aa1-49df-a30b-23f36117381b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.130668 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c800d9a-2aa1-49df-a30b-23f36117381b-kube-api-access-4rknr" (OuterVolumeSpecName: "kube-api-access-4rknr") pod "4c800d9a-2aa1-49df-a30b-23f36117381b" (UID: "4c800d9a-2aa1-49df-a30b-23f36117381b"). InnerVolumeSpecName "kube-api-access-4rknr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.192850 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c800d9a-2aa1-49df-a30b-23f36117381b" (UID: "4c800d9a-2aa1-49df-a30b-23f36117381b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.203547 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"54773416-92b4-406d-b8f1-c78331faa64e","Type":"ContainerStarted","Data":"f3b75cf1902b762b9dc35631963afa7b2c2be281c28b0490c1b0e1bc254d99e0"}
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.206109 4802 generic.go:334] "Generic (PLEG): container finished" podID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerID="9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154" exitCode=0
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.206154 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4c800d9a-2aa1-49df-a30b-23f36117381b","Type":"ContainerDied","Data":"9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154"}
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.206185 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4c800d9a-2aa1-49df-a30b-23f36117381b","Type":"ContainerDied","Data":"83a1a3bfe464258a6ebd0d7e9d32efaa88dfaeb7b81b0e896c50eab590ce42e6"}
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.206282 4802 scope.go:117] "RemoveContainer" containerID="92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.206289 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.223062 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rknr\" (UniqueName: \"kubernetes.io/projected/4c800d9a-2aa1-49df-a30b-23f36117381b-kube-api-access-4rknr\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.223105 4802 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.223117 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.223129 4802 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c800d9a-2aa1-49df-a30b-23f36117381b-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.223140 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.227153 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.227136067 podStartE2EDuration="4.227136067s" podCreationTimestamp="2025-12-01 20:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:50:10.221885774 +0000 UTC m=+3231.784445415" watchObservedRunningTime="2025-12-01 20:50:10.227136067 +0000 UTC m=+3231.789695708"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.267432 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data" (OuterVolumeSpecName: "config-data") pod "4c800d9a-2aa1-49df-a30b-23f36117381b" (UID: "4c800d9a-2aa1-49df-a30b-23f36117381b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.325014 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c800d9a-2aa1-49df-a30b-23f36117381b-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.349649 4802 scope.go:117] "RemoveContainer" containerID="9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.371434 4802 scope.go:117] "RemoveContainer" containerID="92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9"
Dec 01 20:50:10 crc kubenswrapper[4802]: E1201 20:50:10.372061 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9\": container with ID starting with 92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9 not found: ID does not exist" containerID="92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.372118 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9"} err="failed to get container status \"92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9\": rpc error: code = NotFound desc = could not find container \"92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9\": container with ID starting with 92edfdff569346e567dd26f9a973676083700d25c40a2bc4cd054f190ba85cd9 not found: ID does not exist"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.372154 4802 scope.go:117] "RemoveContainer" containerID="9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154"
Dec 01 20:50:10 crc kubenswrapper[4802]: E1201 20:50:10.372555 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154\": container with ID starting with 9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154 not found: ID does not exist" containerID="9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.372597 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154"} err="failed to get container status \"9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154\": rpc error: code = NotFound desc = could not find container \"9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154\": container with ID starting with 9a7f21b2c785699936a502845d10ba53cad44de5790b6ec6da2d0a8ade617154 not found: ID does not exist"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.555759 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.562250 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.577178 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Dec 01 20:50:10 crc kubenswrapper[4802]: E1201 20:50:10.580971 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerName="manila-scheduler"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.581019 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerName="manila-scheduler"
Dec 01 20:50:10 crc kubenswrapper[4802]: E1201 20:50:10.581151 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerName="probe"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.581160 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerName="probe"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.582098 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerName="manila-scheduler"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.582138 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c800d9a-2aa1-49df-a30b-23f36117381b" containerName="probe"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.584604 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.602379 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.603260 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.732798 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.732867 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-scripts\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.732934 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ms8c\" (UniqueName: \"kubernetes.io/projected/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-kube-api-access-4ms8c\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.732953 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.733072 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.733186 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-config-data\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.734085 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c800d9a-2aa1-49df-a30b-23f36117381b" path="/var/lib/kubelet/pods/4c800d9a-2aa1-49df-a30b-23f36117381b/volumes"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.835419 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ms8c\" (UniqueName: \"kubernetes.io/projected/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-kube-api-access-4ms8c\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.835471 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.835598 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.835655 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-config-data\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.835735 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.835786 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-scripts\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.837529 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.840979 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-scripts\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.843106 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.849879 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.850682 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-config-data\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.851787 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.851830 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.875167 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ms8c\" (UniqueName: \"kubernetes.io/projected/6998a6a9-71cf-4abd-ad6c-5e46bdae11cb-kube-api-access-4ms8c\") pod \"manila-scheduler-0\" (UID: \"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb\") " pod="openstack/manila-scheduler-0"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.922023 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:10 crc kubenswrapper[4802]: I1201 20:50:10.924505 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 01 20:50:11 crc kubenswrapper[4802]: I1201 20:50:11.279098 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:11 crc kubenswrapper[4802]: W1201 20:50:11.395597 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6998a6a9_71cf_4abd_ad6c_5e46bdae11cb.slice/crio-b59ca19adaf8d53955c3b0932bccbe0cce69e018fbd3d4f19405c2251818e31a WatchSource:0}: Error finding container b59ca19adaf8d53955c3b0932bccbe0cce69e018fbd3d4f19405c2251818e31a: Status 404 returned error can't find the container with id b59ca19adaf8d53955c3b0932bccbe0cce69e018fbd3d4f19405c2251818e31a
Dec 01 20:50:11 crc kubenswrapper[4802]: I1201 20:50:11.395656 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 01 20:50:11 crc kubenswrapper[4802]: I1201 20:50:11.639356 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-58f6ccd776-gr6kp" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused"
Dec 01 20:50:11 crc kubenswrapper[4802]: I1201 20:50:11.689951 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftfsb"]
Dec 01 20:50:12 crc kubenswrapper[4802]: I1201 20:50:12.236960 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb","Type":"ContainerStarted","Data":"a8a2ff8e7b6db68d87e22c4b60a48c5599dd8de795b0f57deff2241f31aadb54"}
Dec 01 20:50:12 crc kubenswrapper[4802]: I1201 20:50:12.237720 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb","Type":"ContainerStarted","Data":"712a7a471275f9ba0b56d5157bb62492b4ee225a0da1f9a8a631c9dffb7bc59f"}
Dec 01 20:50:12 crc kubenswrapper[4802]: I1201 20:50:12.237734 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"6998a6a9-71cf-4abd-ad6c-5e46bdae11cb","Type":"ContainerStarted","Data":"b59ca19adaf8d53955c3b0932bccbe0cce69e018fbd3d4f19405c2251818e31a"}
Dec 01 20:50:12 crc kubenswrapper[4802]: I1201 20:50:12.260360 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.260342858 podStartE2EDuration="2.260342858s" podCreationTimestamp="2025-12-01 20:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 20:50:12.250918424 +0000 UTC m=+3233.813478065" watchObservedRunningTime="2025-12-01 20:50:12.260342858 +0000 UTC m=+3233.822902499"
Dec 01 20:50:13 crc kubenswrapper[4802]: I1201 20:50:13.243909 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ftfsb" podUID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerName="registry-server" containerID="cri-o://96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a" gracePeriod=2
Dec 01 20:50:13 crc kubenswrapper[4802]: I1201 20:50:13.742696 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:13 crc kubenswrapper[4802]: I1201 20:50:13.897626 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-catalog-content\") pod \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") "
Dec 01 20:50:13 crc kubenswrapper[4802]: I1201 20:50:13.897850 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-utilities\") pod \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") "
Dec 01 20:50:13 crc kubenswrapper[4802]: I1201 20:50:13.897989 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whnvl\" (UniqueName: \"kubernetes.io/projected/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-kube-api-access-whnvl\") pod \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\" (UID: \"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405\") "
Dec 01 20:50:13 crc kubenswrapper[4802]: I1201 20:50:13.898803 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-utilities" (OuterVolumeSpecName: "utilities") pod "9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" (UID: "9a60b7b0-67a6-4c89-9912-ec1cd0b3d405"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:50:13 crc kubenswrapper[4802]: I1201 20:50:13.903640 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-kube-api-access-whnvl" (OuterVolumeSpecName: "kube-api-access-whnvl") pod "9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" (UID: "9a60b7b0-67a6-4c89-9912-ec1cd0b3d405"). InnerVolumeSpecName "kube-api-access-whnvl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:50:13 crc kubenswrapper[4802]: I1201 20:50:13.914722 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" (UID: "9a60b7b0-67a6-4c89-9912-ec1cd0b3d405"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.001394 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.001464 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whnvl\" (UniqueName: \"kubernetes.io/projected/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-kube-api-access-whnvl\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.001492 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.252674 4802 generic.go:334] "Generic (PLEG): container finished" podID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerID="96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a" exitCode=0
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.252733 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftfsb" event={"ID":"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405","Type":"ContainerDied","Data":"96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a"}
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.252780 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ftfsb"
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.252806 4802 scope.go:117] "RemoveContainer" containerID="96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a"
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.252791 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ftfsb" event={"ID":"9a60b7b0-67a6-4c89-9912-ec1cd0b3d405","Type":"ContainerDied","Data":"1d731a346ee2328a426d3617e01d990f14d4056b9b276e101572bfa81f400693"}
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.286036 4802 scope.go:117] "RemoveContainer" containerID="f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce"
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.294909 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftfsb"]
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.308448 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ftfsb"]
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.309878 4802 scope.go:117] "RemoveContainer" containerID="54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06"
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.362034 4802 scope.go:117] "RemoveContainer" containerID="96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a"
Dec 01 20:50:14 crc kubenswrapper[4802]: E1201 20:50:14.362533 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a\": container with ID starting with 96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a not found: ID does not exist" containerID="96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a"
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.362583 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a"} err="failed to get container status \"96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a\": rpc error: code = NotFound desc = could not find container \"96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a\": container with ID starting with 96925e431f760a148f399bbec47a2cd565e0c7f6d3071feec554ee50c308213a not found: ID does not exist"
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.362614 4802 scope.go:117] "RemoveContainer" containerID="f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce"
Dec 01 20:50:14 crc kubenswrapper[4802]: E1201 20:50:14.363111 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce\": container with ID starting with f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce not found: ID does not exist" containerID="f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce"
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.363145 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce"} err="failed to get container status \"f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce\": rpc error: code = NotFound desc = could not find container \"f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce\": container with ID starting with f5dc41f792390250abb478532bfe6bc4d9bafc8d18e6fdc7126c00b4d0e1c6ce not found: ID does not exist"
Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.363165 4802 scope.go:117] "RemoveContainer" containerID="54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06"
Dec 01 20:50:14 crc kubenswrapper[4802]: E1201
20:50:14.363608 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06\": container with ID starting with 54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06 not found: ID does not exist" containerID="54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06" Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.363637 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06"} err="failed to get container status \"54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06\": rpc error: code = NotFound desc = could not find container \"54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06\": container with ID starting with 54b762a1d7d312eee88013e6da4c8c29da51e9a056efe7bbd8d90523510eff06 not found: ID does not exist" Dec 01 20:50:14 crc kubenswrapper[4802]: I1201 20:50:14.729909 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" path="/var/lib/kubelet/pods/9a60b7b0-67a6-4c89-9912-ec1cd0b3d405/volumes" Dec 01 20:50:16 crc kubenswrapper[4802]: I1201 20:50:16.887492 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 01 20:50:20 crc kubenswrapper[4802]: I1201 20:50:20.926169 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 01 20:50:21 crc kubenswrapper[4802]: I1201 20:50:21.638463 4802 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-58f6ccd776-gr6kp" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection 
refused" Dec 01 20:50:21 crc kubenswrapper[4802]: I1201 20:50:21.638896 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.398352 4802 generic.go:334] "Generic (PLEG): container finished" podID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerID="de897dcdcc75080e2b98ab343ee6181eade9fc23a8da10a7e641218df7f93229" exitCode=137 Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.398452 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58f6ccd776-gr6kp" event={"ID":"f2a61354-70f8-4e95-ab30-6e1d90128879","Type":"ContainerDied","Data":"de897dcdcc75080e2b98ab343ee6181eade9fc23a8da10a7e641218df7f93229"} Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.398782 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58f6ccd776-gr6kp" event={"ID":"f2a61354-70f8-4e95-ab30-6e1d90128879","Type":"ContainerDied","Data":"453c089c12f0c993eda2a9db6f02229a6c2f6ff40d03ad1dd2c9798f24ed434d"} Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.398797 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="453c089c12f0c993eda2a9db6f02229a6c2f6ff40d03ad1dd2c9798f24ed434d" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.435805 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.470760 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.593299 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a61354-70f8-4e95-ab30-6e1d90128879-logs\") pod \"f2a61354-70f8-4e95-ab30-6e1d90128879\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.593413 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-scripts\") pod \"f2a61354-70f8-4e95-ab30-6e1d90128879\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.593508 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-secret-key\") pod \"f2a61354-70f8-4e95-ab30-6e1d90128879\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.593594 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-tls-certs\") pod \"f2a61354-70f8-4e95-ab30-6e1d90128879\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.593648 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-combined-ca-bundle\") pod \"f2a61354-70f8-4e95-ab30-6e1d90128879\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.593721 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-config-data\") pod \"f2a61354-70f8-4e95-ab30-6e1d90128879\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.593757 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg6mq\" (UniqueName: \"kubernetes.io/projected/f2a61354-70f8-4e95-ab30-6e1d90128879-kube-api-access-lg6mq\") pod \"f2a61354-70f8-4e95-ab30-6e1d90128879\" (UID: \"f2a61354-70f8-4e95-ab30-6e1d90128879\") " Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.595106 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2a61354-70f8-4e95-ab30-6e1d90128879-logs" (OuterVolumeSpecName: "logs") pod "f2a61354-70f8-4e95-ab30-6e1d90128879" (UID: "f2a61354-70f8-4e95-ab30-6e1d90128879"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.595467 4802 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2a61354-70f8-4e95-ab30-6e1d90128879-logs\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.599743 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f2a61354-70f8-4e95-ab30-6e1d90128879" (UID: "f2a61354-70f8-4e95-ab30-6e1d90128879"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.599885 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a61354-70f8-4e95-ab30-6e1d90128879-kube-api-access-lg6mq" (OuterVolumeSpecName: "kube-api-access-lg6mq") pod "f2a61354-70f8-4e95-ab30-6e1d90128879" (UID: "f2a61354-70f8-4e95-ab30-6e1d90128879"). InnerVolumeSpecName "kube-api-access-lg6mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.622940 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-config-data" (OuterVolumeSpecName: "config-data") pod "f2a61354-70f8-4e95-ab30-6e1d90128879" (UID: "f2a61354-70f8-4e95-ab30-6e1d90128879"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.625659 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-scripts" (OuterVolumeSpecName: "scripts") pod "f2a61354-70f8-4e95-ab30-6e1d90128879" (UID: "f2a61354-70f8-4e95-ab30-6e1d90128879"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.627378 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2a61354-70f8-4e95-ab30-6e1d90128879" (UID: "f2a61354-70f8-4e95-ab30-6e1d90128879"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.650006 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f2a61354-70f8-4e95-ab30-6e1d90128879" (UID: "f2a61354-70f8-4e95-ab30-6e1d90128879"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.696967 4802 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.697001 4802 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.697013 4802 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.697023 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2a61354-70f8-4e95-ab30-6e1d90128879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.697032 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2a61354-70f8-4e95-ab30-6e1d90128879-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:28 crc kubenswrapper[4802]: I1201 20:50:28.697041 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg6mq\" (UniqueName: 
\"kubernetes.io/projected/f2a61354-70f8-4e95-ab30-6e1d90128879-kube-api-access-lg6mq\") on node \"crc\" DevicePath \"\"" Dec 01 20:50:29 crc kubenswrapper[4802]: I1201 20:50:29.407073 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58f6ccd776-gr6kp" Dec 01 20:50:29 crc kubenswrapper[4802]: I1201 20:50:29.436966 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58f6ccd776-gr6kp"] Dec 01 20:50:29 crc kubenswrapper[4802]: I1201 20:50:29.448317 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58f6ccd776-gr6kp"] Dec 01 20:50:30 crc kubenswrapper[4802]: I1201 20:50:30.731532 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" path="/var/lib/kubelet/pods/f2a61354-70f8-4e95-ab30-6e1d90128879/volumes" Dec 01 20:50:31 crc kubenswrapper[4802]: I1201 20:50:31.730362 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 20:50:32 crc kubenswrapper[4802]: I1201 20:50:32.520087 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.053512 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 20:51:36 crc kubenswrapper[4802]: E1201 20:51:36.056295 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerName="extract-content" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.056442 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerName="extract-content" Dec 01 20:51:36 crc kubenswrapper[4802]: E1201 20:51:36.056577 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon" Dec 01 20:51:36 crc 
kubenswrapper[4802]: I1201 20:51:36.056671 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon" Dec 01 20:51:36 crc kubenswrapper[4802]: E1201 20:51:36.056758 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerName="registry-server" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.056830 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerName="registry-server" Dec 01 20:51:36 crc kubenswrapper[4802]: E1201 20:51:36.056909 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerName="extract-utilities" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.056982 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerName="extract-utilities" Dec 01 20:51:36 crc kubenswrapper[4802]: E1201 20:51:36.057092 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon-log" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.057192 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon-log" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.057575 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon-log" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.057694 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a61354-70f8-4e95-ab30-6e1d90128879" containerName="horizon" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.057775 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a60b7b0-67a6-4c89-9912-ec1cd0b3d405" containerName="registry-server" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 
20:51:36.058945 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.061268 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.062035 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.063583 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.070284 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-msrs9" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.075691 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.251363 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jkrw\" (UniqueName: \"kubernetes.io/projected/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-kube-api-access-6jkrw\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.251792 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-config-data\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.251821 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" 
(UniqueName: \"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.251865 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.251892 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.251969 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.252006 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.252022 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.252052 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354150 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354272 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354305 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354361 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354410 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jkrw\" (UniqueName: \"kubernetes.io/projected/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-kube-api-access-6jkrw\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354482 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-config-data\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354520 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354543 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354579 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.354952 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.356110 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.356340 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.356744 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.357130 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.364300 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.367798 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.368082 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.372951 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jkrw\" (UniqueName: \"kubernetes.io/projected/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-kube-api-access-6jkrw\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.396320 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.685140 4802 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 20:51:36 crc kubenswrapper[4802]: I1201 20:51:36.991298 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 20:51:37 crc kubenswrapper[4802]: I1201 20:51:37.091261 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"654db8d6-c501-48bf-bfeb-81f07e7c0e2e","Type":"ContainerStarted","Data":"22686f08cb1db62213f772d5f2957b22c0f13c15b97e9f3511b44ef8608c9854"} Dec 01 20:52:10 crc kubenswrapper[4802]: E1201 20:52:10.926549 4802 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 01 20:52:10 crc kubenswrapper[4802]: E1201 20:52:10.927036 4802 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,S
ubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6jkrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(654db8d6-c501-48bf-bfeb-81f07e7c0e2e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 20:52:10 crc kubenswrapper[4802]: E1201 20:52:10.928405 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="654db8d6-c501-48bf-bfeb-81f07e7c0e2e" Dec 01 20:52:11 crc kubenswrapper[4802]: E1201 20:52:11.499237 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="654db8d6-c501-48bf-bfeb-81f07e7c0e2e" Dec 01 20:52:24 crc kubenswrapper[4802]: I1201 20:52:24.461867 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 20:52:25 crc kubenswrapper[4802]: I1201 20:52:25.657370 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"654db8d6-c501-48bf-bfeb-81f07e7c0e2e","Type":"ContainerStarted","Data":"0f47623f60ecb67c29f67445d3754c49c6201e0a67c08b8febdf34a7a1401004"} Dec 01 20:52:25 crc kubenswrapper[4802]: I1201 20:52:25.693486 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.231646061 podStartE2EDuration="50.693465319s" podCreationTimestamp="2025-12-01 20:51:35 +0000 UTC" firstStartedPulling="2025-12-01 20:51:36.997501106 +0000 UTC m=+3318.560060747" lastFinishedPulling="2025-12-01 20:52:24.459320364 +0000 UTC m=+3366.021880005" observedRunningTime="2025-12-01 20:52:25.688562677 +0000 UTC m=+3367.251122358" 
watchObservedRunningTime="2025-12-01 20:52:25.693465319 +0000 UTC m=+3367.256024980" Dec 01 20:52:28 crc kubenswrapper[4802]: I1201 20:52:28.088618 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:52:28 crc kubenswrapper[4802]: I1201 20:52:28.089113 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:52:58 crc kubenswrapper[4802]: I1201 20:52:58.088496 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:52:58 crc kubenswrapper[4802]: I1201 20:52:58.089260 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:53:28 crc kubenswrapper[4802]: I1201 20:53:28.088973 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:53:28 crc kubenswrapper[4802]: I1201 20:53:28.089663 
4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:53:28 crc kubenswrapper[4802]: I1201 20:53:28.089718 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:53:28 crc kubenswrapper[4802]: I1201 20:53:28.090705 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f8b3c5182085db2dc50eb18e66872b4d95d4989747b298b4b3a4f8c464087706"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:53:28 crc kubenswrapper[4802]: I1201 20:53:28.090782 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://f8b3c5182085db2dc50eb18e66872b4d95d4989747b298b4b3a4f8c464087706" gracePeriod=600 Dec 01 20:53:28 crc kubenswrapper[4802]: I1201 20:53:28.303821 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="f8b3c5182085db2dc50eb18e66872b4d95d4989747b298b4b3a4f8c464087706" exitCode=0 Dec 01 20:53:28 crc kubenswrapper[4802]: I1201 20:53:28.303914 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"f8b3c5182085db2dc50eb18e66872b4d95d4989747b298b4b3a4f8c464087706"} Dec 01 20:53:28 crc kubenswrapper[4802]: 
I1201 20:53:28.304250 4802 scope.go:117] "RemoveContainer" containerID="538c357bb384951b89c9bfc1d2c5541dec7ff0029e20fa2bca33a8d4f9d78098" Dec 01 20:53:29 crc kubenswrapper[4802]: I1201 20:53:29.317911 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"} Dec 01 20:54:48 crc kubenswrapper[4802]: I1201 20:54:48.059045 4802 generic.go:334] "Generic (PLEG): container finished" podID="654db8d6-c501-48bf-bfeb-81f07e7c0e2e" containerID="0f47623f60ecb67c29f67445d3754c49c6201e0a67c08b8febdf34a7a1401004" exitCode=0 Dec 01 20:54:48 crc kubenswrapper[4802]: I1201 20:54:48.059147 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"654db8d6-c501-48bf-bfeb-81f07e7c0e2e","Type":"ContainerDied","Data":"0f47623f60ecb67c29f67445d3754c49c6201e0a67c08b8febdf34a7a1401004"} Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.535867 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.683846 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-temporary\") pod \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.683955 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jkrw\" (UniqueName: \"kubernetes.io/projected/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-kube-api-access-6jkrw\") pod \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.684021 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config\") pod \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.684251 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ca-certs\") pod \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.684292 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.684374 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-workdir\") pod \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.684461 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ssh-key\") pod \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.684544 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config-secret\") pod \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.684604 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-config-data\") pod \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\" (UID: \"654db8d6-c501-48bf-bfeb-81f07e7c0e2e\") " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.684994 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "654db8d6-c501-48bf-bfeb-81f07e7c0e2e" (UID: "654db8d6-c501-48bf-bfeb-81f07e7c0e2e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.685533 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-config-data" (OuterVolumeSpecName: "config-data") pod "654db8d6-c501-48bf-bfeb-81f07e7c0e2e" (UID: "654db8d6-c501-48bf-bfeb-81f07e7c0e2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.685550 4802 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.688151 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "654db8d6-c501-48bf-bfeb-81f07e7c0e2e" (UID: "654db8d6-c501-48bf-bfeb-81f07e7c0e2e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.690495 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-kube-api-access-6jkrw" (OuterVolumeSpecName: "kube-api-access-6jkrw") pod "654db8d6-c501-48bf-bfeb-81f07e7c0e2e" (UID: "654db8d6-c501-48bf-bfeb-81f07e7c0e2e"). InnerVolumeSpecName "kube-api-access-6jkrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.690614 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "654db8d6-c501-48bf-bfeb-81f07e7c0e2e" (UID: "654db8d6-c501-48bf-bfeb-81f07e7c0e2e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.713301 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "654db8d6-c501-48bf-bfeb-81f07e7c0e2e" (UID: "654db8d6-c501-48bf-bfeb-81f07e7c0e2e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.721311 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "654db8d6-c501-48bf-bfeb-81f07e7c0e2e" (UID: "654db8d6-c501-48bf-bfeb-81f07e7c0e2e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.744891 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "654db8d6-c501-48bf-bfeb-81f07e7c0e2e" (UID: "654db8d6-c501-48bf-bfeb-81f07e7c0e2e"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.763500 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "654db8d6-c501-48bf-bfeb-81f07e7c0e2e" (UID: "654db8d6-c501-48bf-bfeb-81f07e7c0e2e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.787290 4802 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.787329 4802 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.787344 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.787358 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jkrw\" (UniqueName: \"kubernetes.io/projected/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-kube-api-access-6jkrw\") on node \"crc\" DevicePath \"\"" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.787371 4802 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.787395 4802 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.787408 4802 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.787423 4802 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/654db8d6-c501-48bf-bfeb-81f07e7c0e2e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.810397 4802 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 01 20:54:49 crc kubenswrapper[4802]: I1201 20:54:49.888857 4802 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 01 20:54:50 crc kubenswrapper[4802]: I1201 20:54:50.081653 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"654db8d6-c501-48bf-bfeb-81f07e7c0e2e","Type":"ContainerDied","Data":"22686f08cb1db62213f772d5f2957b22c0f13c15b97e9f3511b44ef8608c9854"} Dec 01 20:54:50 crc kubenswrapper[4802]: I1201 20:54:50.081695 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22686f08cb1db62213f772d5f2957b22c0f13c15b97e9f3511b44ef8608c9854" Dec 01 20:54:50 crc kubenswrapper[4802]: I1201 20:54:50.081746 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 20:55:01 crc kubenswrapper[4802]: I1201 20:55:01.843519 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 20:55:01 crc kubenswrapper[4802]: E1201 20:55:01.844843 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654db8d6-c501-48bf-bfeb-81f07e7c0e2e" containerName="tempest-tests-tempest-tests-runner" Dec 01 20:55:01 crc kubenswrapper[4802]: I1201 20:55:01.844860 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="654db8d6-c501-48bf-bfeb-81f07e7c0e2e" containerName="tempest-tests-tempest-tests-runner" Dec 01 20:55:01 crc kubenswrapper[4802]: I1201 20:55:01.845041 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="654db8d6-c501-48bf-bfeb-81f07e7c0e2e" containerName="tempest-tests-tempest-tests-runner" Dec 01 20:55:01 crc kubenswrapper[4802]: I1201 20:55:01.845749 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 20:55:01 crc kubenswrapper[4802]: I1201 20:55:01.848991 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-msrs9" Dec 01 20:55:01 crc kubenswrapper[4802]: I1201 20:55:01.878537 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.016522 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ggbw\" (UniqueName: \"kubernetes.io/projected/fa5d0b95-078d-4cb3-a597-5af9283e6503-kube-api-access-5ggbw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fa5d0b95-078d-4cb3-a597-5af9283e6503\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.016638 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fa5d0b95-078d-4cb3-a597-5af9283e6503\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.117933 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ggbw\" (UniqueName: \"kubernetes.io/projected/fa5d0b95-078d-4cb3-a597-5af9283e6503-kube-api-access-5ggbw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fa5d0b95-078d-4cb3-a597-5af9283e6503\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.118107 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fa5d0b95-078d-4cb3-a597-5af9283e6503\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.118576 4802 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fa5d0b95-078d-4cb3-a597-5af9283e6503\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.145979 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fa5d0b95-078d-4cb3-a597-5af9283e6503\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.149916 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ggbw\" (UniqueName: \"kubernetes.io/projected/fa5d0b95-078d-4cb3-a597-5af9283e6503-kube-api-access-5ggbw\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"fa5d0b95-078d-4cb3-a597-5af9283e6503\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.176471 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.614779 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 20:55:02 crc kubenswrapper[4802]: I1201 20:55:02.617388 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 20:55:03 crc kubenswrapper[4802]: I1201 20:55:03.201759 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fa5d0b95-078d-4cb3-a597-5af9283e6503","Type":"ContainerStarted","Data":"3ab48a01219a09251d31bf62de2dfa00502137b537d069d2c4316dabe78a7bb4"} Dec 01 20:55:05 crc kubenswrapper[4802]: I1201 20:55:05.227701 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"fa5d0b95-078d-4cb3-a597-5af9283e6503","Type":"ContainerStarted","Data":"f14c48958f7a7463f754b344ae27ac0c764c0af29d2fe72c170825fc868ffc62"} Dec 01 20:55:05 crc kubenswrapper[4802]: I1201 20:55:05.248489 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.34474593 podStartE2EDuration="4.248467674s" podCreationTimestamp="2025-12-01 20:55:01 +0000 UTC" firstStartedPulling="2025-12-01 20:55:02.614535885 +0000 UTC m=+3524.177095526" lastFinishedPulling="2025-12-01 20:55:04.518257629 +0000 UTC m=+3526.080817270" observedRunningTime="2025-12-01 20:55:05.247326799 +0000 UTC m=+3526.809886440" watchObservedRunningTime="2025-12-01 20:55:05.248467674 +0000 UTC m=+3526.811027315" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.512125 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6rhk/must-gather-25xw2"] Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 
20:55:27.517429 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.526939 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t6rhk"/"kube-root-ca.crt" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.528485 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-t6rhk"/"openshift-service-ca.crt" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.536067 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6rhk/must-gather-25xw2"] Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.650998 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z29zw\" (UniqueName: \"kubernetes.io/projected/d1950264-c629-4757-b443-dcecf41ae2a1-kube-api-access-z29zw\") pod \"must-gather-25xw2\" (UID: \"d1950264-c629-4757-b443-dcecf41ae2a1\") " pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.651105 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1950264-c629-4757-b443-dcecf41ae2a1-must-gather-output\") pod \"must-gather-25xw2\" (UID: \"d1950264-c629-4757-b443-dcecf41ae2a1\") " pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.753380 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1950264-c629-4757-b443-dcecf41ae2a1-must-gather-output\") pod \"must-gather-25xw2\" (UID: \"d1950264-c629-4757-b443-dcecf41ae2a1\") " pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.753546 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z29zw\" (UniqueName: \"kubernetes.io/projected/d1950264-c629-4757-b443-dcecf41ae2a1-kube-api-access-z29zw\") pod \"must-gather-25xw2\" (UID: \"d1950264-c629-4757-b443-dcecf41ae2a1\") " pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.753899 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1950264-c629-4757-b443-dcecf41ae2a1-must-gather-output\") pod \"must-gather-25xw2\" (UID: \"d1950264-c629-4757-b443-dcecf41ae2a1\") " pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.777937 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z29zw\" (UniqueName: \"kubernetes.io/projected/d1950264-c629-4757-b443-dcecf41ae2a1-kube-api-access-z29zw\") pod \"must-gather-25xw2\" (UID: \"d1950264-c629-4757-b443-dcecf41ae2a1\") " pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.819043 4802 scope.go:117] "RemoveContainer" containerID="232156007ccc94a24dde9aeeb757f17c9242d6806240397fa3b536de5424c9e1" Dec 01 20:55:27 crc kubenswrapper[4802]: I1201 20:55:27.839363 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 20:55:28 crc kubenswrapper[4802]: I1201 20:55:28.010583 4802 scope.go:117] "RemoveContainer" containerID="de897dcdcc75080e2b98ab343ee6181eade9fc23a8da10a7e641218df7f93229" Dec 01 20:55:28 crc kubenswrapper[4802]: I1201 20:55:28.088228 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:55:28 crc kubenswrapper[4802]: I1201 20:55:28.088292 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:55:28 crc kubenswrapper[4802]: I1201 20:55:28.544847 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6rhk/must-gather-25xw2"] Dec 01 20:55:29 crc kubenswrapper[4802]: I1201 20:55:29.466547 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/must-gather-25xw2" event={"ID":"d1950264-c629-4757-b443-dcecf41ae2a1","Type":"ContainerStarted","Data":"4e7cb3489fb2e69e27b231d0a00139a08b5d97faa17d56562139f00abbc5b01a"} Dec 01 20:55:34 crc kubenswrapper[4802]: I1201 20:55:34.512881 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/must-gather-25xw2" event={"ID":"d1950264-c629-4757-b443-dcecf41ae2a1","Type":"ContainerStarted","Data":"4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26"} Dec 01 20:55:34 crc kubenswrapper[4802]: I1201 20:55:34.513485 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/must-gather-25xw2" 
event={"ID":"d1950264-c629-4757-b443-dcecf41ae2a1","Type":"ContainerStarted","Data":"21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa"} Dec 01 20:55:34 crc kubenswrapper[4802]: I1201 20:55:34.533361 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6rhk/must-gather-25xw2" podStartSLOduration=3.33301779 podStartE2EDuration="7.533342333s" podCreationTimestamp="2025-12-01 20:55:27 +0000 UTC" firstStartedPulling="2025-12-01 20:55:28.548669466 +0000 UTC m=+3550.111229107" lastFinishedPulling="2025-12-01 20:55:32.748994009 +0000 UTC m=+3554.311553650" observedRunningTime="2025-12-01 20:55:34.526171839 +0000 UTC m=+3556.088731480" watchObservedRunningTime="2025-12-01 20:55:34.533342333 +0000 UTC m=+3556.095901974" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.133341 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6rhk/crc-debug-97qnl"] Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.135452 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.137524 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t6rhk"/"default-dockercfg-slhsj" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.209535 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-host\") pod \"crc-debug-97qnl\" (UID: \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\") " pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.209658 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7ddz\" (UniqueName: \"kubernetes.io/projected/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-kube-api-access-r7ddz\") pod \"crc-debug-97qnl\" (UID: \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\") " pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.311170 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7ddz\" (UniqueName: \"kubernetes.io/projected/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-kube-api-access-r7ddz\") pod \"crc-debug-97qnl\" (UID: \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\") " pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.311332 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-host\") pod \"crc-debug-97qnl\" (UID: \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\") " pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.311508 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-host\") pod \"crc-debug-97qnl\" (UID: \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\") " pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.330856 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7ddz\" (UniqueName: \"kubernetes.io/projected/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-kube-api-access-r7ddz\") pod \"crc-debug-97qnl\" (UID: \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\") " pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.458822 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:55:38 crc kubenswrapper[4802]: I1201 20:55:38.572327 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/crc-debug-97qnl" event={"ID":"14d62371-fcf1-4de3-b48f-b8cbd79f8f97","Type":"ContainerStarted","Data":"8922fcce9590006c4ad606687b54b086105b0d89eecf6a3031db9818dced5965"} Dec 01 20:55:50 crc kubenswrapper[4802]: I1201 20:55:50.743642 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/crc-debug-97qnl" event={"ID":"14d62371-fcf1-4de3-b48f-b8cbd79f8f97","Type":"ContainerStarted","Data":"bab5ec7cce6cdfeec70c1ee85cf23ef90bd21d9802e82fa9bc4df99f234580ac"} Dec 01 20:55:50 crc kubenswrapper[4802]: I1201 20:55:50.770485 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6rhk/crc-debug-97qnl" podStartSLOduration=0.926875657 podStartE2EDuration="12.77046415s" podCreationTimestamp="2025-12-01 20:55:38 +0000 UTC" firstStartedPulling="2025-12-01 20:55:38.499834099 +0000 UTC m=+3560.062393740" lastFinishedPulling="2025-12-01 20:55:50.343422592 +0000 UTC m=+3571.905982233" observedRunningTime="2025-12-01 20:55:50.766587429 +0000 UTC m=+3572.329147070" watchObservedRunningTime="2025-12-01 
20:55:50.77046415 +0000 UTC m=+3572.333023801" Dec 01 20:55:58 crc kubenswrapper[4802]: I1201 20:55:58.088532 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:55:58 crc kubenswrapper[4802]: I1201 20:55:58.089039 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:56:28 crc kubenswrapper[4802]: I1201 20:56:28.088420 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 20:56:28 crc kubenswrapper[4802]: I1201 20:56:28.088985 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 20:56:28 crc kubenswrapper[4802]: I1201 20:56:28.089029 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 20:56:28 crc kubenswrapper[4802]: I1201 20:56:28.090249 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 20:56:28 crc kubenswrapper[4802]: I1201 20:56:28.090379 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" gracePeriod=600 Dec 01 20:56:28 crc kubenswrapper[4802]: E1201 20:56:28.223268 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:56:29 crc kubenswrapper[4802]: I1201 20:56:29.135458 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" exitCode=0 Dec 01 20:56:29 crc kubenswrapper[4802]: I1201 20:56:29.136366 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"} Dec 01 20:56:29 crc kubenswrapper[4802]: I1201 20:56:29.136438 4802 scope.go:117] "RemoveContainer" containerID="f8b3c5182085db2dc50eb18e66872b4d95d4989747b298b4b3a4f8c464087706" Dec 01 20:56:29 crc kubenswrapper[4802]: I1201 20:56:29.140485 4802 
scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:56:29 crc kubenswrapper[4802]: E1201 20:56:29.140865 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:56:31 crc kubenswrapper[4802]: I1201 20:56:31.158488 4802 generic.go:334] "Generic (PLEG): container finished" podID="14d62371-fcf1-4de3-b48f-b8cbd79f8f97" containerID="bab5ec7cce6cdfeec70c1ee85cf23ef90bd21d9802e82fa9bc4df99f234580ac" exitCode=0 Dec 01 20:56:31 crc kubenswrapper[4802]: I1201 20:56:31.158534 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/crc-debug-97qnl" event={"ID":"14d62371-fcf1-4de3-b48f-b8cbd79f8f97","Type":"ContainerDied","Data":"bab5ec7cce6cdfeec70c1ee85cf23ef90bd21d9802e82fa9bc4df99f234580ac"} Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.278904 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.314000 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6rhk/crc-debug-97qnl"] Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.321940 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6rhk/crc-debug-97qnl"] Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.397675 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-host\") pod \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\" (UID: \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\") " Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.397742 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7ddz\" (UniqueName: \"kubernetes.io/projected/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-kube-api-access-r7ddz\") pod \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\" (UID: \"14d62371-fcf1-4de3-b48f-b8cbd79f8f97\") " Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.397798 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-host" (OuterVolumeSpecName: "host") pod "14d62371-fcf1-4de3-b48f-b8cbd79f8f97" (UID: "14d62371-fcf1-4de3-b48f-b8cbd79f8f97"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.398183 4802 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-host\") on node \"crc\" DevicePath \"\"" Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.406383 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-kube-api-access-r7ddz" (OuterVolumeSpecName: "kube-api-access-r7ddz") pod "14d62371-fcf1-4de3-b48f-b8cbd79f8f97" (UID: "14d62371-fcf1-4de3-b48f-b8cbd79f8f97"). InnerVolumeSpecName "kube-api-access-r7ddz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.499640 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7ddz\" (UniqueName: \"kubernetes.io/projected/14d62371-fcf1-4de3-b48f-b8cbd79f8f97-kube-api-access-r7ddz\") on node \"crc\" DevicePath \"\"" Dec 01 20:56:32 crc kubenswrapper[4802]: I1201 20:56:32.730455 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d62371-fcf1-4de3-b48f-b8cbd79f8f97" path="/var/lib/kubelet/pods/14d62371-fcf1-4de3-b48f-b8cbd79f8f97/volumes" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.184485 4802 scope.go:117] "RemoveContainer" containerID="bab5ec7cce6cdfeec70c1ee85cf23ef90bd21d9802e82fa9bc4df99f234580ac" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.184535 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-97qnl" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.466263 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6rhk/crc-debug-7trzm"] Dec 01 20:56:33 crc kubenswrapper[4802]: E1201 20:56:33.467706 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d62371-fcf1-4de3-b48f-b8cbd79f8f97" containerName="container-00" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.467755 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d62371-fcf1-4de3-b48f-b8cbd79f8f97" containerName="container-00" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.467969 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d62371-fcf1-4de3-b48f-b8cbd79f8f97" containerName="container-00" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.468724 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.471388 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-t6rhk"/"default-dockercfg-slhsj" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.519342 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-host\") pod \"crc-debug-7trzm\" (UID: \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\") " pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.519388 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgpsg\" (UniqueName: \"kubernetes.io/projected/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-kube-api-access-dgpsg\") pod \"crc-debug-7trzm\" (UID: \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\") " 
pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.621506 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-host\") pod \"crc-debug-7trzm\" (UID: \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\") " pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.621596 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgpsg\" (UniqueName: \"kubernetes.io/projected/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-kube-api-access-dgpsg\") pod \"crc-debug-7trzm\" (UID: \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\") " pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.621664 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-host\") pod \"crc-debug-7trzm\" (UID: \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\") " pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.638998 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgpsg\" (UniqueName: \"kubernetes.io/projected/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-kube-api-access-dgpsg\") pod \"crc-debug-7trzm\" (UID: \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\") " pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:33 crc kubenswrapper[4802]: I1201 20:56:33.785997 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:34 crc kubenswrapper[4802]: I1201 20:56:34.195333 4802 generic.go:334] "Generic (PLEG): container finished" podID="6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5" containerID="65d8c45d5d820fa0b45235f5349129e7536a5ba2ebcf2b3d6c5486312808ef05" exitCode=0 Dec 01 20:56:34 crc kubenswrapper[4802]: I1201 20:56:34.195409 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/crc-debug-7trzm" event={"ID":"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5","Type":"ContainerDied","Data":"65d8c45d5d820fa0b45235f5349129e7536a5ba2ebcf2b3d6c5486312808ef05"} Dec 01 20:56:34 crc kubenswrapper[4802]: I1201 20:56:34.195683 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/crc-debug-7trzm" event={"ID":"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5","Type":"ContainerStarted","Data":"61b23605d659b6af3e19f41ae2c63e11995704427d2197e2e9cf023eca31b8bd"} Dec 01 20:56:34 crc kubenswrapper[4802]: I1201 20:56:34.711666 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6rhk/crc-debug-7trzm"] Dec 01 20:56:34 crc kubenswrapper[4802]: I1201 20:56:34.729610 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6rhk/crc-debug-7trzm"] Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.298836 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.455838 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgpsg\" (UniqueName: \"kubernetes.io/projected/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-kube-api-access-dgpsg\") pod \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\" (UID: \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\") " Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.455882 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-host\") pod \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\" (UID: \"6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5\") " Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.456045 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-host" (OuterVolumeSpecName: "host") pod "6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5" (UID: "6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.456435 4802 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-host\") on node \"crc\" DevicePath \"\"" Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.461516 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-kube-api-access-dgpsg" (OuterVolumeSpecName: "kube-api-access-dgpsg") pod "6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5" (UID: "6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5"). InnerVolumeSpecName "kube-api-access-dgpsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.558859 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgpsg\" (UniqueName: \"kubernetes.io/projected/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5-kube-api-access-dgpsg\") on node \"crc\" DevicePath \"\"" Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.934790 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6rhk/crc-debug-zfz4t"] Dec 01 20:56:35 crc kubenswrapper[4802]: E1201 20:56:35.935259 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5" containerName="container-00" Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.935274 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5" containerName="container-00" Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.935477 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5" containerName="container-00" Dec 01 20:56:35 crc kubenswrapper[4802]: I1201 20:56:35.936090 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:36 crc kubenswrapper[4802]: I1201 20:56:36.068669 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddqvq\" (UniqueName: \"kubernetes.io/projected/4614a55f-c552-4c14-b946-9e1b0674e4d2-kube-api-access-ddqvq\") pod \"crc-debug-zfz4t\" (UID: \"4614a55f-c552-4c14-b946-9e1b0674e4d2\") " pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:36 crc kubenswrapper[4802]: I1201 20:56:36.068773 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4614a55f-c552-4c14-b946-9e1b0674e4d2-host\") pod \"crc-debug-zfz4t\" (UID: \"4614a55f-c552-4c14-b946-9e1b0674e4d2\") " pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:36 crc kubenswrapper[4802]: I1201 20:56:36.172525 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddqvq\" (UniqueName: \"kubernetes.io/projected/4614a55f-c552-4c14-b946-9e1b0674e4d2-kube-api-access-ddqvq\") pod \"crc-debug-zfz4t\" (UID: \"4614a55f-c552-4c14-b946-9e1b0674e4d2\") " pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:36 crc kubenswrapper[4802]: I1201 20:56:36.172633 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4614a55f-c552-4c14-b946-9e1b0674e4d2-host\") pod \"crc-debug-zfz4t\" (UID: \"4614a55f-c552-4c14-b946-9e1b0674e4d2\") " pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:36 crc kubenswrapper[4802]: I1201 20:56:36.172873 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4614a55f-c552-4c14-b946-9e1b0674e4d2-host\") pod \"crc-debug-zfz4t\" (UID: \"4614a55f-c552-4c14-b946-9e1b0674e4d2\") " pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:36 crc 
kubenswrapper[4802]: I1201 20:56:36.193918 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddqvq\" (UniqueName: \"kubernetes.io/projected/4614a55f-c552-4c14-b946-9e1b0674e4d2-kube-api-access-ddqvq\") pod \"crc-debug-zfz4t\" (UID: \"4614a55f-c552-4c14-b946-9e1b0674e4d2\") " pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:36 crc kubenswrapper[4802]: I1201 20:56:36.222538 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61b23605d659b6af3e19f41ae2c63e11995704427d2197e2e9cf023eca31b8bd" Dec 01 20:56:36 crc kubenswrapper[4802]: I1201 20:56:36.222676 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-7trzm" Dec 01 20:56:36 crc kubenswrapper[4802]: I1201 20:56:36.258016 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:36 crc kubenswrapper[4802]: W1201 20:56:36.294291 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4614a55f_c552_4c14_b946_9e1b0674e4d2.slice/crio-a06edf46b333726a8c25e6a2aff759549b6d9deb52c949648738f8319868111b WatchSource:0}: Error finding container a06edf46b333726a8c25e6a2aff759549b6d9deb52c949648738f8319868111b: Status 404 returned error can't find the container with id a06edf46b333726a8c25e6a2aff759549b6d9deb52c949648738f8319868111b Dec 01 20:56:36 crc kubenswrapper[4802]: I1201 20:56:36.737007 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5" path="/var/lib/kubelet/pods/6ec23f59-dc6c-4d3c-9b06-e23e179b6ce5/volumes" Dec 01 20:56:37 crc kubenswrapper[4802]: I1201 20:56:37.232136 4802 generic.go:334] "Generic (PLEG): container finished" podID="4614a55f-c552-4c14-b946-9e1b0674e4d2" 
containerID="ecc0b9ec42be3193fc30f0a64afbe564844770012f3bb7b52ad786823da6979e" exitCode=0 Dec 01 20:56:37 crc kubenswrapper[4802]: I1201 20:56:37.232229 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" event={"ID":"4614a55f-c552-4c14-b946-9e1b0674e4d2","Type":"ContainerDied","Data":"ecc0b9ec42be3193fc30f0a64afbe564844770012f3bb7b52ad786823da6979e"} Dec 01 20:56:37 crc kubenswrapper[4802]: I1201 20:56:37.232464 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" event={"ID":"4614a55f-c552-4c14-b946-9e1b0674e4d2","Type":"ContainerStarted","Data":"a06edf46b333726a8c25e6a2aff759549b6d9deb52c949648738f8319868111b"} Dec 01 20:56:37 crc kubenswrapper[4802]: I1201 20:56:37.273232 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6rhk/crc-debug-zfz4t"] Dec 01 20:56:37 crc kubenswrapper[4802]: I1201 20:56:37.281529 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6rhk/crc-debug-zfz4t"] Dec 01 20:56:38 crc kubenswrapper[4802]: I1201 20:56:38.385855 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:38 crc kubenswrapper[4802]: I1201 20:56:38.531474 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4614a55f-c552-4c14-b946-9e1b0674e4d2-host\") pod \"4614a55f-c552-4c14-b946-9e1b0674e4d2\" (UID: \"4614a55f-c552-4c14-b946-9e1b0674e4d2\") " Dec 01 20:56:38 crc kubenswrapper[4802]: I1201 20:56:38.531579 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4614a55f-c552-4c14-b946-9e1b0674e4d2-host" (OuterVolumeSpecName: "host") pod "4614a55f-c552-4c14-b946-9e1b0674e4d2" (UID: "4614a55f-c552-4c14-b946-9e1b0674e4d2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 20:56:38 crc kubenswrapper[4802]: I1201 20:56:38.531935 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddqvq\" (UniqueName: \"kubernetes.io/projected/4614a55f-c552-4c14-b946-9e1b0674e4d2-kube-api-access-ddqvq\") pod \"4614a55f-c552-4c14-b946-9e1b0674e4d2\" (UID: \"4614a55f-c552-4c14-b946-9e1b0674e4d2\") " Dec 01 20:56:38 crc kubenswrapper[4802]: I1201 20:56:38.532818 4802 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4614a55f-c552-4c14-b946-9e1b0674e4d2-host\") on node \"crc\" DevicePath \"\"" Dec 01 20:56:38 crc kubenswrapper[4802]: I1201 20:56:38.544481 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4614a55f-c552-4c14-b946-9e1b0674e4d2-kube-api-access-ddqvq" (OuterVolumeSpecName: "kube-api-access-ddqvq") pod "4614a55f-c552-4c14-b946-9e1b0674e4d2" (UID: "4614a55f-c552-4c14-b946-9e1b0674e4d2"). InnerVolumeSpecName "kube-api-access-ddqvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:56:38 crc kubenswrapper[4802]: I1201 20:56:38.634964 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddqvq\" (UniqueName: \"kubernetes.io/projected/4614a55f-c552-4c14-b946-9e1b0674e4d2-kube-api-access-ddqvq\") on node \"crc\" DevicePath \"\"" Dec 01 20:56:38 crc kubenswrapper[4802]: I1201 20:56:38.734867 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4614a55f-c552-4c14-b946-9e1b0674e4d2" path="/var/lib/kubelet/pods/4614a55f-c552-4c14-b946-9e1b0674e4d2/volumes" Dec 01 20:56:39 crc kubenswrapper[4802]: I1201 20:56:39.253860 4802 scope.go:117] "RemoveContainer" containerID="ecc0b9ec42be3193fc30f0a64afbe564844770012f3bb7b52ad786823da6979e" Dec 01 20:56:39 crc kubenswrapper[4802]: I1201 20:56:39.253874 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6rhk/crc-debug-zfz4t" Dec 01 20:56:39 crc kubenswrapper[4802]: I1201 20:56:39.720383 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:56:39 crc kubenswrapper[4802]: E1201 20:56:39.721071 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.699108 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lvk56"] Dec 01 20:56:47 crc kubenswrapper[4802]: E1201 20:56:47.700108 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4614a55f-c552-4c14-b946-9e1b0674e4d2" containerName="container-00" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.700126 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="4614a55f-c552-4c14-b946-9e1b0674e4d2" containerName="container-00" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.700414 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="4614a55f-c552-4c14-b946-9e1b0674e4d2" containerName="container-00" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.702187 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.729610 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvk56"] Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.852611 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-utilities\") pod \"certified-operators-lvk56\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.852719 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-catalog-content\") pod \"certified-operators-lvk56\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.852940 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmlrl\" (UniqueName: \"kubernetes.io/projected/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-kube-api-access-zmlrl\") pod \"certified-operators-lvk56\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.955306 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-utilities\") pod \"certified-operators-lvk56\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.955380 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-catalog-content\") pod \"certified-operators-lvk56\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.955416 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmlrl\" (UniqueName: \"kubernetes.io/projected/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-kube-api-access-zmlrl\") pod \"certified-operators-lvk56\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.955951 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-utilities\") pod \"certified-operators-lvk56\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.955972 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-catalog-content\") pod \"certified-operators-lvk56\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:47 crc kubenswrapper[4802]: I1201 20:56:47.980903 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmlrl\" (UniqueName: \"kubernetes.io/projected/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-kube-api-access-zmlrl\") pod \"certified-operators-lvk56\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:48 crc kubenswrapper[4802]: I1201 20:56:48.028533 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:48 crc kubenswrapper[4802]: I1201 20:56:48.558331 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvk56"] Dec 01 20:56:49 crc kubenswrapper[4802]: I1201 20:56:49.361292 4802 generic.go:334] "Generic (PLEG): container finished" podID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerID="d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89" exitCode=0 Dec 01 20:56:49 crc kubenswrapper[4802]: I1201 20:56:49.361625 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvk56" event={"ID":"e66e0249-9adc-4cb7-b1ed-c328fd1d640b","Type":"ContainerDied","Data":"d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89"} Dec 01 20:56:49 crc kubenswrapper[4802]: I1201 20:56:49.361659 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvk56" event={"ID":"e66e0249-9adc-4cb7-b1ed-c328fd1d640b","Type":"ContainerStarted","Data":"213755e83b3e965f886a4b6df4868e0fd71adefb23d0b8839a223be108a56f07"} Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.691412 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6wbgr"] Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.695869 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.740007 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wbgr"] Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.824858 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqsbz\" (UniqueName: \"kubernetes.io/projected/de78ca73-365f-4dbd-8d2e-816d89eb6d79-kube-api-access-zqsbz\") pod \"redhat-operators-6wbgr\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.824908 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-utilities\") pod \"redhat-operators-6wbgr\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.825064 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-catalog-content\") pod \"redhat-operators-6wbgr\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.927171 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqsbz\" (UniqueName: \"kubernetes.io/projected/de78ca73-365f-4dbd-8d2e-816d89eb6d79-kube-api-access-zqsbz\") pod \"redhat-operators-6wbgr\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.927475 4802 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-utilities\") pod \"redhat-operators-6wbgr\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.927721 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-catalog-content\") pod \"redhat-operators-6wbgr\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.928455 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-utilities\") pod \"redhat-operators-6wbgr\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.928625 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-catalog-content\") pod \"redhat-operators-6wbgr\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:50 crc kubenswrapper[4802]: I1201 20:56:50.949930 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqsbz\" (UniqueName: \"kubernetes.io/projected/de78ca73-365f-4dbd-8d2e-816d89eb6d79-kube-api-access-zqsbz\") pod \"redhat-operators-6wbgr\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:51 crc kubenswrapper[4802]: I1201 20:56:51.091175 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:56:51 crc kubenswrapper[4802]: I1201 20:56:51.388089 4802 generic.go:334] "Generic (PLEG): container finished" podID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerID="018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728" exitCode=0 Dec 01 20:56:51 crc kubenswrapper[4802]: I1201 20:56:51.388322 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvk56" event={"ID":"e66e0249-9adc-4cb7-b1ed-c328fd1d640b","Type":"ContainerDied","Data":"018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728"} Dec 01 20:56:51 crc kubenswrapper[4802]: W1201 20:56:51.591772 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde78ca73_365f_4dbd_8d2e_816d89eb6d79.slice/crio-5255db6e067e5c6af5f8147ba08c65280f710e83137e1bbe8393e4a33c73d247 WatchSource:0}: Error finding container 5255db6e067e5c6af5f8147ba08c65280f710e83137e1bbe8393e4a33c73d247: Status 404 returned error can't find the container with id 5255db6e067e5c6af5f8147ba08c65280f710e83137e1bbe8393e4a33c73d247 Dec 01 20:56:51 crc kubenswrapper[4802]: I1201 20:56:51.593104 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wbgr"] Dec 01 20:56:52 crc kubenswrapper[4802]: I1201 20:56:52.409429 4802 generic.go:334] "Generic (PLEG): container finished" podID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerID="55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9" exitCode=0 Dec 01 20:56:52 crc kubenswrapper[4802]: I1201 20:56:52.409520 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wbgr" event={"ID":"de78ca73-365f-4dbd-8d2e-816d89eb6d79","Type":"ContainerDied","Data":"55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9"} Dec 01 20:56:52 crc kubenswrapper[4802]: I1201 20:56:52.410218 
4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wbgr" event={"ID":"de78ca73-365f-4dbd-8d2e-816d89eb6d79","Type":"ContainerStarted","Data":"5255db6e067e5c6af5f8147ba08c65280f710e83137e1bbe8393e4a33c73d247"} Dec 01 20:56:54 crc kubenswrapper[4802]: I1201 20:56:54.430698 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvk56" event={"ID":"e66e0249-9adc-4cb7-b1ed-c328fd1d640b","Type":"ContainerStarted","Data":"d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110"} Dec 01 20:56:54 crc kubenswrapper[4802]: I1201 20:56:54.436534 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wbgr" event={"ID":"de78ca73-365f-4dbd-8d2e-816d89eb6d79","Type":"ContainerStarted","Data":"10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c"} Dec 01 20:56:54 crc kubenswrapper[4802]: I1201 20:56:54.452834 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lvk56" podStartSLOduration=3.522463126 podStartE2EDuration="7.452816434s" podCreationTimestamp="2025-12-01 20:56:47 +0000 UTC" firstStartedPulling="2025-12-01 20:56:49.365325206 +0000 UTC m=+3630.927884887" lastFinishedPulling="2025-12-01 20:56:53.295678554 +0000 UTC m=+3634.858238195" observedRunningTime="2025-12-01 20:56:54.450576693 +0000 UTC m=+3636.013136334" watchObservedRunningTime="2025-12-01 20:56:54.452816434 +0000 UTC m=+3636.015376075" Dec 01 20:56:54 crc kubenswrapper[4802]: I1201 20:56:54.720475 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:56:54 crc kubenswrapper[4802]: E1201 20:56:54.720794 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:56:57 crc kubenswrapper[4802]: I1201 20:56:57.465257 4802 generic.go:334] "Generic (PLEG): container finished" podID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerID="10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c" exitCode=0 Dec 01 20:56:57 crc kubenswrapper[4802]: I1201 20:56:57.465346 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wbgr" event={"ID":"de78ca73-365f-4dbd-8d2e-816d89eb6d79","Type":"ContainerDied","Data":"10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c"} Dec 01 20:56:58 crc kubenswrapper[4802]: I1201 20:56:58.030668 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:58 crc kubenswrapper[4802]: I1201 20:56:58.031250 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:58 crc kubenswrapper[4802]: I1201 20:56:58.101989 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:58 crc kubenswrapper[4802]: I1201 20:56:58.476918 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wbgr" event={"ID":"de78ca73-365f-4dbd-8d2e-816d89eb6d79","Type":"ContainerStarted","Data":"cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf"} Dec 01 20:56:58 crc kubenswrapper[4802]: I1201 20:56:58.509407 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6wbgr" podStartSLOduration=2.938076411 podStartE2EDuration="8.509383606s" podCreationTimestamp="2025-12-01 20:56:50 +0000 UTC" 
firstStartedPulling="2025-12-01 20:56:52.412335417 +0000 UTC m=+3633.974895058" lastFinishedPulling="2025-12-01 20:56:57.983642612 +0000 UTC m=+3639.546202253" observedRunningTime="2025-12-01 20:56:58.494085589 +0000 UTC m=+3640.056645230" watchObservedRunningTime="2025-12-01 20:56:58.509383606 +0000 UTC m=+3640.071943247" Dec 01 20:56:58 crc kubenswrapper[4802]: I1201 20:56:58.543175 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:56:59 crc kubenswrapper[4802]: I1201 20:56:59.487891 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lvk56"] Dec 01 20:57:00 crc kubenswrapper[4802]: I1201 20:57:00.491904 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lvk56" podUID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerName="registry-server" containerID="cri-o://d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110" gracePeriod=2 Dec 01 20:57:00 crc kubenswrapper[4802]: I1201 20:57:00.963494 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.044322 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-catalog-content\") pod \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.044455 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmlrl\" (UniqueName: \"kubernetes.io/projected/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-kube-api-access-zmlrl\") pod \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.044564 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-utilities\") pod \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\" (UID: \"e66e0249-9adc-4cb7-b1ed-c328fd1d640b\") " Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.045128 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-utilities" (OuterVolumeSpecName: "utilities") pod "e66e0249-9adc-4cb7-b1ed-c328fd1d640b" (UID: "e66e0249-9adc-4cb7-b1ed-c328fd1d640b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.050720 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-kube-api-access-zmlrl" (OuterVolumeSpecName: "kube-api-access-zmlrl") pod "e66e0249-9adc-4cb7-b1ed-c328fd1d640b" (UID: "e66e0249-9adc-4cb7-b1ed-c328fd1d640b"). InnerVolumeSpecName "kube-api-access-zmlrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.093381 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.093813 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.117328 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e66e0249-9adc-4cb7-b1ed-c328fd1d640b" (UID: "e66e0249-9adc-4cb7-b1ed-c328fd1d640b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.147068 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.147429 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.147448 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmlrl\" (UniqueName: \"kubernetes.io/projected/e66e0249-9adc-4cb7-b1ed-c328fd1d640b-kube-api-access-zmlrl\") on node \"crc\" DevicePath \"\"" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.501173 4802 generic.go:334] "Generic (PLEG): container finished" podID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerID="d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110" exitCode=0 Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 
20:57:01.501225 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvk56" event={"ID":"e66e0249-9adc-4cb7-b1ed-c328fd1d640b","Type":"ContainerDied","Data":"d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110"} Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.501269 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvk56" event={"ID":"e66e0249-9adc-4cb7-b1ed-c328fd1d640b","Type":"ContainerDied","Data":"213755e83b3e965f886a4b6df4868e0fd71adefb23d0b8839a223be108a56f07"} Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.501286 4802 scope.go:117] "RemoveContainer" containerID="d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.509360 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvk56" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.524719 4802 scope.go:117] "RemoveContainer" containerID="018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.543638 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lvk56"] Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.565404 4802 scope.go:117] "RemoveContainer" containerID="d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.579422 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lvk56"] Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.599386 4802 scope.go:117] "RemoveContainer" containerID="d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110" Dec 01 20:57:01 crc kubenswrapper[4802]: E1201 20:57:01.599933 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110\": container with ID starting with d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110 not found: ID does not exist" containerID="d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.599978 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110"} err="failed to get container status \"d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110\": rpc error: code = NotFound desc = could not find container \"d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110\": container with ID starting with d1deb8c80ced896d93041c72cb08346fa536a5dbe7ea36fa9b6a324796be8110 not found: ID does not exist" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.600014 4802 scope.go:117] "RemoveContainer" containerID="018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728" Dec 01 20:57:01 crc kubenswrapper[4802]: E1201 20:57:01.600439 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728\": container with ID starting with 018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728 not found: ID does not exist" containerID="018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.600475 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728"} err="failed to get container status \"018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728\": rpc error: code = NotFound desc = could not find container 
\"018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728\": container with ID starting with 018e5f10a1b05d6298c4a43eb193e29068394a263084665a12d639b8fd877728 not found: ID does not exist" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.600496 4802 scope.go:117] "RemoveContainer" containerID="d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89" Dec 01 20:57:01 crc kubenswrapper[4802]: E1201 20:57:01.600768 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89\": container with ID starting with d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89 not found: ID does not exist" containerID="d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89" Dec 01 20:57:01 crc kubenswrapper[4802]: I1201 20:57:01.600798 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89"} err="failed to get container status \"d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89\": rpc error: code = NotFound desc = could not find container \"d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89\": container with ID starting with d15c41a469bddbbf6353af0c5ed6098abfc971a75ea0cf7b6c87d072c2cf2c89 not found: ID does not exist" Dec 01 20:57:02 crc kubenswrapper[4802]: I1201 20:57:02.136265 4802 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6wbgr" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerName="registry-server" probeResult="failure" output=< Dec 01 20:57:02 crc kubenswrapper[4802]: timeout: failed to connect service ":50051" within 1s Dec 01 20:57:02 crc kubenswrapper[4802]: > Dec 01 20:57:02 crc kubenswrapper[4802]: I1201 20:57:02.732533 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" path="/var/lib/kubelet/pods/e66e0249-9adc-4cb7-b1ed-c328fd1d640b/volumes" Dec 01 20:57:06 crc kubenswrapper[4802]: I1201 20:57:06.720781 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:57:06 crc kubenswrapper[4802]: E1201 20:57:06.721631 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:57:11 crc kubenswrapper[4802]: I1201 20:57:11.135214 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:57:11 crc kubenswrapper[4802]: I1201 20:57:11.190664 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:57:11 crc kubenswrapper[4802]: I1201 20:57:11.377403 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6wbgr"] Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.042419 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69b4dccd58-q9lk2_2b86c57b-7125-4ead-88b7-7f5998651f39/barbican-api/0.log" Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.143425 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69b4dccd58-q9lk2_2b86c57b-7125-4ead-88b7-7f5998651f39/barbican-api-log/0.log" Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.279736 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-67fcb6786-rkbj5_f9d802de-8a16-4fec-8768-b09841678cc8/barbican-keystone-listener/0.log" Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.451370 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67fcb6786-rkbj5_f9d802de-8a16-4fec-8768-b09841678cc8/barbican-keystone-listener-log/0.log" Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.540694 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f8997c475-6j472_7116c50f-a3ef-4975-9dca-2070fbdac59a/barbican-worker/0.log" Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.556638 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f8997c475-6j472_7116c50f-a3ef-4975-9dca-2070fbdac59a/barbican-worker-log/0.log" Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.622980 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6wbgr" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerName="registry-server" containerID="cri-o://cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf" gracePeriod=2 Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.697840 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5_e04c0b98-f144-4917-be97-11a6b8f2b449/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.825414 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bdefd2ec-84d7-4e92-adeb-969ad52e35b6/ceilometer-central-agent/0.log" Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.910092 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bdefd2ec-84d7-4e92-adeb-969ad52e35b6/proxy-httpd/0.log" Dec 01 20:57:12 crc kubenswrapper[4802]: I1201 20:57:12.953349 
4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bdefd2ec-84d7-4e92-adeb-969ad52e35b6/ceilometer-notification-agent/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.034557 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bdefd2ec-84d7-4e92-adeb-969ad52e35b6/sg-core/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.092572 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.166512 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp_7db48dd7-0156-4073-918a-5f4e4c1244d9/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.256865 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2_0b81c691-0a4b-48b6-b1e1-151e1cac847c/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.275255 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-catalog-content\") pod \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.275352 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqsbz\" (UniqueName: \"kubernetes.io/projected/de78ca73-365f-4dbd-8d2e-816d89eb6d79-kube-api-access-zqsbz\") pod \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.275491 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-utilities\") pod \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\" (UID: \"de78ca73-365f-4dbd-8d2e-816d89eb6d79\") " Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.276238 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-utilities" (OuterVolumeSpecName: "utilities") pod "de78ca73-365f-4dbd-8d2e-816d89eb6d79" (UID: "de78ca73-365f-4dbd-8d2e-816d89eb6d79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.282033 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de78ca73-365f-4dbd-8d2e-816d89eb6d79-kube-api-access-zqsbz" (OuterVolumeSpecName: "kube-api-access-zqsbz") pod "de78ca73-365f-4dbd-8d2e-816d89eb6d79" (UID: "de78ca73-365f-4dbd-8d2e-816d89eb6d79"). InnerVolumeSpecName "kube-api-access-zqsbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.377807 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.377836 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqsbz\" (UniqueName: \"kubernetes.io/projected/de78ca73-365f-4dbd-8d2e-816d89eb6d79-kube-api-access-zqsbz\") on node \"crc\" DevicePath \"\"" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.387410 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de78ca73-365f-4dbd-8d2e-816d89eb6d79" (UID: "de78ca73-365f-4dbd-8d2e-816d89eb6d79"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.405897 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_372989c2-e54c-4031-9b41-926f7be64266/cinder-api-log/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.424836 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_372989c2-e54c-4031-9b41-926f7be64266/cinder-api/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.479761 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de78ca73-365f-4dbd-8d2e-816d89eb6d79-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.644947 4802 generic.go:334] "Generic (PLEG): container finished" podID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerID="cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf" exitCode=0 Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.644999 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wbgr" event={"ID":"de78ca73-365f-4dbd-8d2e-816d89eb6d79","Type":"ContainerDied","Data":"cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf"} Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.645031 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wbgr" event={"ID":"de78ca73-365f-4dbd-8d2e-816d89eb6d79","Type":"ContainerDied","Data":"5255db6e067e5c6af5f8147ba08c65280f710e83137e1bbe8393e4a33c73d247"} Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.645049 4802 scope.go:117] "RemoveContainer" containerID="cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.645248 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6wbgr" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.680764 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cb0c455b-a5d4-41cf-87c3-673a3deac7cb/probe/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.683508 4802 scope.go:117] "RemoveContainer" containerID="10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.691311 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6wbgr"] Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.701298 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6wbgr"] Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.718432 4802 scope.go:117] "RemoveContainer" containerID="55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.760911 4802 scope.go:117] "RemoveContainer" containerID="cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf" Dec 01 20:57:13 crc kubenswrapper[4802]: E1201 20:57:13.761247 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf\": container with ID starting with cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf not found: ID does not exist" containerID="cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.761287 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf"} err="failed to get container status \"cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf\": rpc error: code = NotFound desc = could 
not find container \"cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf\": container with ID starting with cfb1ee621efa845a8f3eb1072fac1eefad9b1bcb18224a00f82313d7ee2bedcf not found: ID does not exist" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.761313 4802 scope.go:117] "RemoveContainer" containerID="10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c" Dec 01 20:57:13 crc kubenswrapper[4802]: E1201 20:57:13.765579 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c\": container with ID starting with 10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c not found: ID does not exist" containerID="10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.765675 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c"} err="failed to get container status \"10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c\": rpc error: code = NotFound desc = could not find container \"10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c\": container with ID starting with 10829188cff139c4845461e11c4c7ab29971c4669310ceb1f3414078d1c7285c not found: ID does not exist" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.765757 4802 scope.go:117] "RemoveContainer" containerID="55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9" Dec 01 20:57:13 crc kubenswrapper[4802]: E1201 20:57:13.768562 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9\": container with ID starting with 55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9 not found: 
ID does not exist" containerID="55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.768669 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9"} err="failed to get container status \"55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9\": rpc error: code = NotFound desc = could not find container \"55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9\": container with ID starting with 55b099841bcdccbd164c9866923e9c3495812ea899f93977f2c44b1f599630e9 not found: ID does not exist" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.811476 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cb0c455b-a5d4-41cf-87c3-673a3deac7cb/cinder-backup/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.820549 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0/cinder-scheduler/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.944325 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0/probe/0.log" Dec 01 20:57:13 crc kubenswrapper[4802]: I1201 20:57:13.987646 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_83603474-dc08-4ea8-a158-cba205dab6da/probe/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.084287 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_83603474-dc08-4ea8-a158-cba205dab6da/cinder-volume/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.176034 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-g62m8_3a154c18-5d93-4d73-9e97-90fb21082eea/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.317724 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz_5a9ae28b-9e09-4918-b72b-e22abd2e6dec/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.460378 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-jwp25_66a9cb74-956c-4846-91b9-a4dac0834347/init/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.586289 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-jwp25_66a9cb74-956c-4846-91b9-a4dac0834347/init/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.598846 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-jwp25_66a9cb74-956c-4846-91b9-a4dac0834347/dnsmasq-dns/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.677650 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_30bb57c2-94ae-48ad-9e68-0b595b58246b/glance-httpd/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.731279 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" path="/var/lib/kubelet/pods/de78ca73-365f-4dbd-8d2e-816d89eb6d79/volumes" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.800637 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_30bb57c2-94ae-48ad-9e68-0b595b58246b/glance-log/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.838632 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae/glance-httpd/0.log" Dec 01 20:57:14 crc kubenswrapper[4802]: I1201 20:57:14.886425 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae/glance-log/0.log" Dec 01 20:57:15 crc kubenswrapper[4802]: I1201 20:57:15.112739 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b69f75cb8-xrkks_40185112-89e4-49c3-9ccc-0b190724c5ff/horizon/0.log" Dec 01 20:57:15 crc kubenswrapper[4802]: I1201 20:57:15.116398 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b69f75cb8-xrkks_40185112-89e4-49c3-9ccc-0b190724c5ff/horizon-log/0.log" Dec 01 20:57:15 crc kubenswrapper[4802]: I1201 20:57:15.244620 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-96mpg_189db1d5-3210-4707-b02f-8434a36a5791/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:15 crc kubenswrapper[4802]: I1201 20:57:15.354964 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mwtvq_ba286b73-fe12-499f-b959-296217015c6b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:15 crc kubenswrapper[4802]: I1201 20:57:15.620751 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_3358c4e8-0931-4d2e-82d6-527c54f3537c/kube-state-metrics/0.log" Dec 01 20:57:15 crc kubenswrapper[4802]: I1201 20:57:15.665122 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7977dfdfb6-dnr99_f78fa699-c933-4160-b4dc-5b3db575ac17/keystone-api/0.log" Dec 01 20:57:15 crc kubenswrapper[4802]: I1201 20:57:15.858080 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-3642-account-create-update-p2zgg_f597ab05-4236-4a1c-95cb-3ce637a2dd52/mariadb-account-create-update/0.log" Dec 01 20:57:15 crc kubenswrapper[4802]: I1201 20:57:15.887558 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc_bad07c68-596f-44ca-9580-335176bd8049/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:16 crc kubenswrapper[4802]: I1201 20:57:16.104279 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_cde0d25e-888c-44b9-95a0-3bdae318a8b0/manila-api-log/0.log" Dec 01 20:57:16 crc kubenswrapper[4802]: I1201 20:57:16.146001 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_cde0d25e-888c-44b9-95a0-3bdae318a8b0/manila-api/0.log" Dec 01 20:57:16 crc kubenswrapper[4802]: I1201 20:57:16.311666 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-create-mkqmt_a66b5e77-5e95-4fd0-957c-9d57f62a2238/mariadb-database-create/0.log" Dec 01 20:57:16 crc kubenswrapper[4802]: I1201 20:57:16.401170 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-sync-bms4z_402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c/manila-db-sync/0.log" Dec 01 20:57:16 crc kubenswrapper[4802]: I1201 20:57:16.689725 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_6998a6a9-71cf-4abd-ad6c-5e46bdae11cb/probe/0.log" Dec 01 20:57:16 crc kubenswrapper[4802]: I1201 20:57:16.731850 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_6998a6a9-71cf-4abd-ad6c-5e46bdae11cb/manila-scheduler/0.log" Dec 01 20:57:16 crc kubenswrapper[4802]: I1201 20:57:16.782593 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_54773416-92b4-406d-b8f1-c78331faa64e/manila-share/0.log" Dec 01 20:57:16 crc kubenswrapper[4802]: I1201 20:57:16.840611 4802 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_54773416-92b4-406d-b8f1-c78331faa64e/probe/0.log" Dec 01 20:57:16 crc kubenswrapper[4802]: I1201 20:57:16.957388 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_06e3c630-6e2f-4fde-96ac-feea509e3dcb/memcached/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.026831 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-84c4b4b5d7-2ph8r_86540934-a020-4a27-bfa6-62fbc7cfe412/neutron-api/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.041385 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-84c4b4b5d7-2ph8r_86540934-a020-4a27-bfa6-62fbc7cfe412/neutron-httpd/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.179414 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7_2112a496-9a70-408c-99ec-211c3ba2defe/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.380859 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb/nova-api-log/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.549361 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb/nova-api-api/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.551867 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f926810d-a46a-4504-9117-1584f02f386a/nova-cell0-conductor-conductor/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.695101 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_27620668-7b86-40fb-af4b-0c2524e097a7/nova-cell1-conductor-conductor/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 
20:57:17.719624 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:57:17 crc kubenswrapper[4802]: E1201 20:57:17.721374 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.768699 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_66cec21d-0c5f-4b29-8268-fb8f64d68bfb/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.826885 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z_5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:17 crc kubenswrapper[4802]: I1201 20:57:17.948311 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_77a605ed-0bb9-4c8d-9a6b-86643ff44518/nova-metadata-log/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.169670 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5eedf56f-d2da-4526-94fc-346c826a891d/nova-scheduler-scheduler/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.255530 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_00254b08-a75a-4965-8b19-f4bc8ebf6f52/mysql-bootstrap/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.433583 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_00254b08-a75a-4965-8b19-f4bc8ebf6f52/galera/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.447562 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_00254b08-a75a-4965-8b19-f4bc8ebf6f52/mysql-bootstrap/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.528906 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f360e58-7047-4369-a8c8-4e0394586f62/mysql-bootstrap/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.692915 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f360e58-7047-4369-a8c8-4e0394586f62/mysql-bootstrap/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.720325 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_77a605ed-0bb9-4c8d-9a6b-86643ff44518/nova-metadata-metadata/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.767053 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f360e58-7047-4369-a8c8-4e0394586f62/galera/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.796722 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_19e13a2e-794d-4757-8c64-e1895a5e819d/openstackclient/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.912025 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-gczlr_4bbe4b6e-302e-4d6d-bc17-5f35baca1067/ovn-controller/0.log" Dec 01 20:57:18 crc kubenswrapper[4802]: I1201 20:57:18.982218 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6p8lx_dd8140ed-9737-48ea-a0ea-15003dd90986/openstack-network-exporter/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.108119 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-j9thx_15107e62-7679-460c-ab0e-f208b4a1ec76/ovsdb-server-init/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.266624 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j9thx_15107e62-7679-460c-ab0e-f208b4a1ec76/ovsdb-server/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.272777 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j9thx_15107e62-7679-460c-ab0e-f208b4a1ec76/ovs-vswitchd/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.277084 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j9thx_15107e62-7679-460c-ab0e-f208b4a1ec76/ovsdb-server-init/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.324490 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zpzvj_6c2f2991-db4b-4a58-807a-f3b617d9542f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.479148 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a/openstack-network-exporter/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.490402 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a/ovn-northd/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.541870 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c/openstack-network-exporter/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.644931 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c/ovsdbserver-nb/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.687894 4802 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9/openstack-network-exporter/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.738302 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9/ovsdbserver-sb/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.841140 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-646fbc85dd-2ttbm_3853723d-4452-4112-99bf-c3850a983f5d/placement-api/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.903831 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d078b34a-6a2a-4ea0-b7c8-c99ff6942170/setup-container/0.log" Dec 01 20:57:19 crc kubenswrapper[4802]: I1201 20:57:19.927700 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-646fbc85dd-2ttbm_3853723d-4452-4112-99bf-c3850a983f5d/placement-log/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.096189 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1fe488ab-29e8-4ed4-8663-be8e88c1a7ef/setup-container/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.099880 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d078b34a-6a2a-4ea0-b7c8-c99ff6942170/rabbitmq/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.106945 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d078b34a-6a2a-4ea0-b7c8-c99ff6942170/setup-container/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.312403 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1fe488ab-29e8-4ed4-8663-be8e88c1a7ef/rabbitmq/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.327941 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_1fe488ab-29e8-4ed4-8663-be8e88c1a7ef/setup-container/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.339379 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc_5daf0e64-4a00-45c3-9830-46f81436faff/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.490359 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp_d5a316ab-c296-4ab8-8397-00e5a017d1cc/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.518448 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8jtzl_73e55b8c-927e-43fa-9104-8db3dc67fdde/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.575876 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-zqvg6_73a1d762-0b3b-4abf-9072-0b5bdba7bd72/ssh-known-hosts-edpm-deployment/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.705372 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_654db8d6-c501-48bf-bfeb-81f07e7c0e2e/tempest-tests-tempest-tests-runner/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.815389 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_fa5d0b95-078d-4cb3-a597-5af9283e6503/test-operator-logs-container/0.log" Dec 01 20:57:20 crc kubenswrapper[4802]: I1201 20:57:20.926174 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-vngdm_43faaae9-0df9-4e49-a5cc-2fc51e008edc/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 
20:57:30 crc kubenswrapper[4802]: I1201 20:57:30.721691 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:57:30 crc kubenswrapper[4802]: E1201 20:57:30.722413 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:57:41 crc kubenswrapper[4802]: I1201 20:57:41.189413 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/util/0.log" Dec 01 20:57:41 crc kubenswrapper[4802]: I1201 20:57:41.342874 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/util/0.log" Dec 01 20:57:41 crc kubenswrapper[4802]: I1201 20:57:41.358881 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/pull/0.log" Dec 01 20:57:41 crc kubenswrapper[4802]: I1201 20:57:41.380337 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/pull/0.log" Dec 01 20:57:41 crc kubenswrapper[4802]: I1201 20:57:41.611991 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/util/0.log" Dec 01 20:57:41 crc 
kubenswrapper[4802]: I1201 20:57:41.636111 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/extract/0.log" Dec 01 20:57:41 crc kubenswrapper[4802]: I1201 20:57:41.649055 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/pull/0.log" Dec 01 20:57:41 crc kubenswrapper[4802]: I1201 20:57:41.819217 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-6d8qg_18526d53-2d4c-4c40-885c-c83b3b378260/kube-rbac-proxy/0.log" Dec 01 20:57:41 crc kubenswrapper[4802]: I1201 20:57:41.871403 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-n7mjb_e067c10a-e5d4-4e57-bf14-3b0bfc8ac069/kube-rbac-proxy/0.log" Dec 01 20:57:41 crc kubenswrapper[4802]: I1201 20:57:41.913571 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-6d8qg_18526d53-2d4c-4c40-885c-c83b3b378260/manager/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.092846 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-n7mjb_e067c10a-e5d4-4e57-bf14-3b0bfc8ac069/manager/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.145725 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-qbvvz_b01ea1d5-0409-4c32-bb34-1b88253ceb05/kube-rbac-proxy/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.151252 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-qbvvz_b01ea1d5-0409-4c32-bb34-1b88253ceb05/manager/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.273709 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-776c976b46-x7bkb_eb553ce8-f696-4c6b-a745-aa1faa5f9356/kube-rbac-proxy/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.420686 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-776c976b46-x7bkb_eb553ce8-f696-4c6b-a745-aa1faa5f9356/manager/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.488682 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5ftkv_8626cbeb-8604-4371-b936-99cab8d76742/manager/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.520004 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5ftkv_8626cbeb-8604-4371-b936-99cab8d76742/kube-rbac-proxy/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.655681 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-kmtdj_e5c436a3-2237-4f02-a9fc-b2aae90ce3b1/kube-rbac-proxy/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.669191 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-kmtdj_e5c436a3-2237-4f02-a9fc-b2aae90ce3b1/manager/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.818931 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-jbfz7_b40759e9-9a00-445c-964e-09f1d539d85e/kube-rbac-proxy/0.log" Dec 01 20:57:42 crc kubenswrapper[4802]: I1201 20:57:42.928558 
4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-jcmlp_74aa06c0-a03f-4719-b751-a77ab3d472f2/kube-rbac-proxy/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.010101 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-jbfz7_b40759e9-9a00-445c-964e-09f1d539d85e/manager/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.046518 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-jcmlp_74aa06c0-a03f-4719-b751-a77ab3d472f2/manager/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.127374 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-2nm2t_0e1b6ed3-9b66-4279-9a9f-0685037df9c3/kube-rbac-proxy/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.245754 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-2nm2t_0e1b6ed3-9b66-4279-9a9f-0685037df9c3/manager/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.316236 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-7stkq_1891b769-8e7e-4375-b3ea-421a23fb7af4/kube-rbac-proxy/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.360161 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-7stkq_1891b769-8e7e-4375-b3ea-421a23fb7af4/manager/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.423474 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-p475v_6973effc-3f05-43cd-ba03-b9efe3b6db1d/kube-rbac-proxy/0.log" Dec 01 20:57:43 
crc kubenswrapper[4802]: I1201 20:57:43.523530 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-p475v_6973effc-3f05-43cd-ba03-b9efe3b6db1d/manager/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.651322 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-cswfs_e58c799d-fcaa-4d9b-aa6c-c8947774bd2e/kube-rbac-proxy/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.693643 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-cswfs_e58c799d-fcaa-4d9b-aa6c-c8947774bd2e/manager/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.720775 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:57:43 crc kubenswrapper[4802]: E1201 20:57:43.721002 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.755466 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-djvhl_c7839b31-af95-4d33-a954-9615ea0c87a6/kube-rbac-proxy/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.939754 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-djvhl_c7839b31-af95-4d33-a954-9615ea0c87a6/manager/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.945845 4802 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-vb97q_8e5dddc5-34ff-4a71-a626-3c9cea7ef30f/kube-rbac-proxy/0.log" Dec 01 20:57:43 crc kubenswrapper[4802]: I1201 20:57:43.977726 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-vb97q_8e5dddc5-34ff-4a71-a626-3c9cea7ef30f/manager/0.log" Dec 01 20:57:44 crc kubenswrapper[4802]: I1201 20:57:44.163128 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4552rv_d12b9eb3-946b-4578-8630-4cb6643ab36f/manager/0.log" Dec 01 20:57:44 crc kubenswrapper[4802]: I1201 20:57:44.173648 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4552rv_d12b9eb3-946b-4578-8630-4cb6643ab36f/kube-rbac-proxy/0.log" Dec 01 20:57:44 crc kubenswrapper[4802]: I1201 20:57:44.621955 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-t7nmz_47078ba5-f704-4077-93af-c0afffa2070f/registry-server/0.log" Dec 01 20:57:44 crc kubenswrapper[4802]: I1201 20:57:44.627625 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-849fbcc767-rv5gz_b26cce34-e8fa-4145-a0dd-daa30dfdde81/operator/0.log" Dec 01 20:57:44 crc kubenswrapper[4802]: I1201 20:57:44.954865 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-lfh62_0934b0fd-8a48-4dee-b668-08c7b631551f/manager/0.log" Dec 01 20:57:44 crc kubenswrapper[4802]: I1201 20:57:44.982372 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-lfh62_0934b0fd-8a48-4dee-b668-08c7b631551f/kube-rbac-proxy/0.log" Dec 01 20:57:45 crc 
kubenswrapper[4802]: I1201 20:57:45.004249 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-45m8m_088be214-85a6-4cb1-9e02-fcde44abb492/kube-rbac-proxy/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.176396 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-45m8m_088be214-85a6-4cb1-9e02-fcde44abb492/manager/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.228962 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mnwdn_a3350b6c-2091-4a61-a78e-5a1bcdfd11cf/operator/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.453867 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-trddt_bb89e7bb-899f-4f3e-80cd-833fbc74db85/kube-rbac-proxy/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.466674 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-trddt_bb89e7bb-899f-4f3e-80cd-833fbc74db85/manager/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.542938 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8b6ht_369e7da7-22d9-470f-9ad0-48472ceffde4/kube-rbac-proxy/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.751810 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wx6ct_92132c51-643c-4442-adf2-897bd2825fdf/kube-rbac-proxy/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.761408 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wx6ct_92132c51-643c-4442-adf2-897bd2825fdf/manager/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.764057 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8b6ht_369e7da7-22d9-470f-9ad0-48472ceffde4/manager/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.826034 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-79774867dd-5sjpr_aebbca29-71df-4bef-8108-66b226259a58/manager/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.983930 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r89vq_97d3762b-15ce-45aa-9767-5be47c85e039/manager/0.log" Dec 01 20:57:45 crc kubenswrapper[4802]: I1201 20:57:45.991485 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r89vq_97d3762b-15ce-45aa-9767-5be47c85e039/kube-rbac-proxy/0.log" Dec 01 20:57:54 crc kubenswrapper[4802]: I1201 20:57:54.720880 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:57:54 crc kubenswrapper[4802]: E1201 20:57:54.721757 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:58:03 crc kubenswrapper[4802]: I1201 20:58:03.055219 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-287bb_d8d9c52e-a041-4e4c-a364-ef09f105a206/control-plane-machine-set-operator/0.log" Dec 01 20:58:03 crc kubenswrapper[4802]: I1201 20:58:03.228404 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mx65z_d2c2d1ec-c588-4247-aae2-c228404a38e0/machine-api-operator/0.log" Dec 01 20:58:03 crc kubenswrapper[4802]: I1201 20:58:03.235881 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mx65z_d2c2d1ec-c588-4247-aae2-c228404a38e0/kube-rbac-proxy/0.log" Dec 01 20:58:05 crc kubenswrapper[4802]: I1201 20:58:05.720622 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:58:05 crc kubenswrapper[4802]: E1201 20:58:05.721885 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:58:14 crc kubenswrapper[4802]: I1201 20:58:14.883417 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7zjnt_07ba6850-9e9e-42d2-bd61-dc97bc185119/cert-manager-controller/0.log" Dec 01 20:58:15 crc kubenswrapper[4802]: I1201 20:58:15.064886 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-r5lnb_2608cc8e-13d1-43b6-b033-1b62df0333fb/cert-manager-cainjector/0.log" Dec 01 20:58:15 crc kubenswrapper[4802]: I1201 20:58:15.107765 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-p86w8_b8339c52-f023-4f4c-9cf2-948f94a27e7a/cert-manager-webhook/0.log" Dec 01 20:58:19 crc kubenswrapper[4802]: I1201 20:58:19.719796 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:58:19 crc kubenswrapper[4802]: E1201 20:58:19.720602 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:58:26 crc kubenswrapper[4802]: I1201 20:58:26.694029 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-tlx97_8a21a68c-7399-4632-9564-1c0650125ea5/nmstate-console-plugin/0.log" Dec 01 20:58:26 crc kubenswrapper[4802]: I1201 20:58:26.892490 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pkwpd_df60afec-8603-441f-88bb-31d054b7fea5/nmstate-handler/0.log" Dec 01 20:58:26 crc kubenswrapper[4802]: I1201 20:58:26.897666 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8g77d_9923b241-3a3d-4051-b5c4-6677dff519ed/kube-rbac-proxy/0.log" Dec 01 20:58:26 crc kubenswrapper[4802]: I1201 20:58:26.932136 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8g77d_9923b241-3a3d-4051-b5c4-6677dff519ed/nmstate-metrics/0.log" Dec 01 20:58:27 crc kubenswrapper[4802]: I1201 20:58:27.175501 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-n9pjm_75a57a90-06f5-444e-897d-1191d7838e8b/nmstate-operator/0.log" Dec 01 20:58:27 crc kubenswrapper[4802]: I1201 20:58:27.179054 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-9kcsk_b504895d-c4d4-4261-ab7d-24532e127650/nmstate-webhook/0.log" Dec 01 20:58:30 crc kubenswrapper[4802]: I1201 20:58:30.721823 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:58:30 crc kubenswrapper[4802]: E1201 20:58:30.744308 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:58:40 crc kubenswrapper[4802]: I1201 20:58:40.891209 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-x5zbd_36c33153-2c15-48db-9ab8-a52854a85093/kube-rbac-proxy/0.log" Dec 01 20:58:41 crc kubenswrapper[4802]: I1201 20:58:41.038056 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-x5zbd_36c33153-2c15-48db-9ab8-a52854a85093/controller/0.log" Dec 01 20:58:41 crc kubenswrapper[4802]: I1201 20:58:41.150624 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-frr-files/0.log" Dec 01 20:58:41 crc kubenswrapper[4802]: I1201 20:58:41.365274 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-frr-files/0.log" Dec 01 20:58:41 crc kubenswrapper[4802]: I1201 20:58:41.379586 4802 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-metrics/0.log" Dec 01 20:58:41 crc kubenswrapper[4802]: I1201 20:58:41.383916 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-reloader/0.log" Dec 01 20:58:41 crc kubenswrapper[4802]: I1201 20:58:41.411694 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-reloader/0.log" Dec 01 20:58:41 crc kubenswrapper[4802]: I1201 20:58:41.963097 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-reloader/0.log" Dec 01 20:58:41 crc kubenswrapper[4802]: I1201 20:58:41.963131 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-frr-files/0.log" Dec 01 20:58:41 crc kubenswrapper[4802]: I1201 20:58:41.963938 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-metrics/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.012133 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-metrics/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.370304 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-frr-files/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.383270 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-metrics/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.388828 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-reloader/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.409229 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/controller/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.555453 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/frr-metrics/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.583704 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/kube-rbac-proxy-frr/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.629339 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/kube-rbac-proxy/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.839774 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/reloader/0.log" Dec 01 20:58:42 crc kubenswrapper[4802]: I1201 20:58:42.895067 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-5rvzc_14b6cfe2-8222-45da-808e-2a3d64d13b94/frr-k8s-webhook-server/0.log" Dec 01 20:58:43 crc kubenswrapper[4802]: I1201 20:58:43.027421 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79cd97f75d-bjkc2_27545afc-bda6-468c-b9c2-8ab3182546c8/manager/0.log" Dec 01 20:58:43 crc kubenswrapper[4802]: I1201 20:58:43.888476 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d94989d67-mpm9p_0ca0e887-c648-46c5-941a-96fc3a8e551e/webhook-server/0.log" Dec 01 20:58:43 crc kubenswrapper[4802]: I1201 20:58:43.912132 4802 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zjgml_4c77b924-7d9d-48b6-9e00-476f7df7104c/kube-rbac-proxy/0.log" Dec 01 20:58:43 crc kubenswrapper[4802]: I1201 20:58:43.991631 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/frr/0.log" Dec 01 20:58:44 crc kubenswrapper[4802]: I1201 20:58:44.364733 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zjgml_4c77b924-7d9d-48b6-9e00-476f7df7104c/speaker/0.log" Dec 01 20:58:44 crc kubenswrapper[4802]: I1201 20:58:44.720421 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:58:44 crc kubenswrapper[4802]: E1201 20:58:44.720657 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:58:57 crc kubenswrapper[4802]: I1201 20:58:57.719992 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 20:58:57 crc kubenswrapper[4802]: E1201 20:58:57.720747 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 20:58:57 crc kubenswrapper[4802]: I1201 20:58:57.812165 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/util/0.log" Dec 01 20:58:57 crc kubenswrapper[4802]: I1201 20:58:57.988123 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/pull/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.002451 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/util/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.039781 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/pull/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.202303 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/util/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.230850 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/extract/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.242551 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/pull/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.395263 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/util/0.log" Dec 01 
20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.543948 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/pull/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.549293 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/util/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.609563 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/pull/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.739507 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/util/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.747319 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/pull/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.773887 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/extract/0.log" Dec 01 20:58:58 crc kubenswrapper[4802]: I1201 20:58:58.932012 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-utilities/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.071832 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-utilities/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.095737 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-content/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.110291 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-content/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.291187 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-content/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.302572 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-utilities/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.520789 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-utilities/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.813464 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-content/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.923706 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-utilities/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.933841 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/registry-server/0.log" Dec 01 20:58:59 crc kubenswrapper[4802]: I1201 20:58:59.934148 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-content/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.112831 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-content/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.118355 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-utilities/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.355939 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fpwgt_4fffad75-c42a-40d4-a2f3-d770091b01fa/marketplace-operator/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.360063 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-utilities/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.595221 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/registry-server/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.636528 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-content/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.650750 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-utilities/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.701914 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-content/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.891807 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-content/0.log" Dec 01 20:59:00 crc kubenswrapper[4802]: I1201 20:59:00.901313 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-utilities/0.log" Dec 01 20:59:01 crc kubenswrapper[4802]: I1201 20:59:01.011561 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/registry-server/0.log" Dec 01 20:59:01 crc kubenswrapper[4802]: I1201 20:59:01.321847 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-utilities/0.log" Dec 01 20:59:01 crc kubenswrapper[4802]: I1201 20:59:01.649547 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-content/0.log" Dec 01 20:59:01 crc kubenswrapper[4802]: I1201 20:59:01.649761 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-content/0.log" Dec 01 20:59:01 crc kubenswrapper[4802]: I1201 20:59:01.681942 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-utilities/0.log" 
Dec 01 20:59:01 crc kubenswrapper[4802]: I1201 20:59:01.865627 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-utilities/0.log"
Dec 01 20:59:01 crc kubenswrapper[4802]: I1201 20:59:01.869914 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-content/0.log"
Dec 01 20:59:02 crc kubenswrapper[4802]: I1201 20:59:02.332682 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/registry-server/0.log"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.586933 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mf5g7"]
Dec 01 20:59:06 crc kubenswrapper[4802]: E1201 20:59:06.588813 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerName="extract-content"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.588916 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerName="extract-content"
Dec 01 20:59:06 crc kubenswrapper[4802]: E1201 20:59:06.588985 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerName="extract-utilities"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.589050 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerName="extract-utilities"
Dec 01 20:59:06 crc kubenswrapper[4802]: E1201 20:59:06.589118 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerName="extract-utilities"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.589213 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerName="extract-utilities"
Dec 01 20:59:06 crc kubenswrapper[4802]: E1201 20:59:06.589318 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerName="registry-server"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.589379 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerName="registry-server"
Dec 01 20:59:06 crc kubenswrapper[4802]: E1201 20:59:06.589436 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerName="registry-server"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.589488 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerName="registry-server"
Dec 01 20:59:06 crc kubenswrapper[4802]: E1201 20:59:06.589553 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerName="extract-content"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.589615 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerName="extract-content"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.589915 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66e0249-9adc-4cb7-b1ed-c328fd1d640b" containerName="registry-server"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.590016 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="de78ca73-365f-4dbd-8d2e-816d89eb6d79" containerName="registry-server"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.591535 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.602333 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mf5g7"]
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.668456 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-catalog-content\") pod \"community-operators-mf5g7\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") " pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.668756 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-utilities\") pod \"community-operators-mf5g7\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") " pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.668971 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v8ck\" (UniqueName: \"kubernetes.io/projected/e76b9251-7edd-411c-8957-9eb88cc69aa2-kube-api-access-5v8ck\") pod \"community-operators-mf5g7\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") " pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.771035 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-utilities\") pod \"community-operators-mf5g7\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") " pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.771210 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v8ck\" (UniqueName: \"kubernetes.io/projected/e76b9251-7edd-411c-8957-9eb88cc69aa2-kube-api-access-5v8ck\") pod \"community-operators-mf5g7\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") " pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.771303 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-catalog-content\") pod \"community-operators-mf5g7\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") " pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.771631 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-utilities\") pod \"community-operators-mf5g7\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") " pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.771689 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-catalog-content\") pod \"community-operators-mf5g7\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") " pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.792622 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v8ck\" (UniqueName: \"kubernetes.io/projected/e76b9251-7edd-411c-8957-9eb88cc69aa2-kube-api-access-5v8ck\") pod \"community-operators-mf5g7\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") " pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:06 crc kubenswrapper[4802]: I1201 20:59:06.924130 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:07 crc kubenswrapper[4802]: I1201 20:59:07.531706 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mf5g7"]
Dec 01 20:59:07 crc kubenswrapper[4802]: I1201 20:59:07.867855 4802 generic.go:334] "Generic (PLEG): container finished" podID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerID="4e18a31b0a2a6a3bdb6a795c873831bd770c5eb707ef1c18b7448a285237f634" exitCode=0
Dec 01 20:59:07 crc kubenswrapper[4802]: I1201 20:59:07.867903 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mf5g7" event={"ID":"e76b9251-7edd-411c-8957-9eb88cc69aa2","Type":"ContainerDied","Data":"4e18a31b0a2a6a3bdb6a795c873831bd770c5eb707ef1c18b7448a285237f634"}
Dec 01 20:59:07 crc kubenswrapper[4802]: I1201 20:59:07.867953 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mf5g7" event={"ID":"e76b9251-7edd-411c-8957-9eb88cc69aa2","Type":"ContainerStarted","Data":"1b5517c142a38c56ca4535c11b7400fe16b8e61398d9d6734a4c010cc9619915"}
Dec 01 20:59:08 crc kubenswrapper[4802]: I1201 20:59:08.878698 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mf5g7" event={"ID":"e76b9251-7edd-411c-8957-9eb88cc69aa2","Type":"ContainerStarted","Data":"f248abdd8fbcf01d0d8465de3be4f420729d604183af8cf669aa0ae052b59570"}
Dec 01 20:59:09 crc kubenswrapper[4802]: I1201 20:59:09.889163 4802 generic.go:334] "Generic (PLEG): container finished" podID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerID="f248abdd8fbcf01d0d8465de3be4f420729d604183af8cf669aa0ae052b59570" exitCode=0
Dec 01 20:59:09 crc kubenswrapper[4802]: I1201 20:59:09.889235 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mf5g7" event={"ID":"e76b9251-7edd-411c-8957-9eb88cc69aa2","Type":"ContainerDied","Data":"f248abdd8fbcf01d0d8465de3be4f420729d604183af8cf669aa0ae052b59570"}
Dec 01 20:59:10 crc kubenswrapper[4802]: I1201 20:59:10.903144 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mf5g7" event={"ID":"e76b9251-7edd-411c-8957-9eb88cc69aa2","Type":"ContainerStarted","Data":"e55c26958f2e0870ea597a0bc6c2f409e9029e1c6b95b078d350525917eef443"}
Dec 01 20:59:10 crc kubenswrapper[4802]: I1201 20:59:10.932253 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mf5g7" podStartSLOduration=2.391195295 podStartE2EDuration="4.932238162s" podCreationTimestamp="2025-12-01 20:59:06 +0000 UTC" firstStartedPulling="2025-12-01 20:59:07.869965813 +0000 UTC m=+3769.432525454" lastFinishedPulling="2025-12-01 20:59:10.41100868 +0000 UTC m=+3771.973568321" observedRunningTime="2025-12-01 20:59:10.92894095 +0000 UTC m=+3772.491500601" watchObservedRunningTime="2025-12-01 20:59:10.932238162 +0000 UTC m=+3772.494797793"
Dec 01 20:59:11 crc kubenswrapper[4802]: I1201 20:59:11.720603 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"
Dec 01 20:59:11 crc kubenswrapper[4802]: E1201 20:59:11.721126 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 20:59:13 crc kubenswrapper[4802]: I1201 20:59:13.050549 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-3642-account-create-update-p2zgg"]
Dec 01 20:59:13 crc kubenswrapper[4802]: I1201 20:59:13.062847 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-mkqmt"]
Dec 01 20:59:13 crc kubenswrapper[4802]: I1201 20:59:13.081474 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-3642-account-create-update-p2zgg"]
Dec 01 20:59:13 crc kubenswrapper[4802]: I1201 20:59:13.092288 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-mkqmt"]
Dec 01 20:59:14 crc kubenswrapper[4802]: I1201 20:59:14.732530 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a66b5e77-5e95-4fd0-957c-9d57f62a2238" path="/var/lib/kubelet/pods/a66b5e77-5e95-4fd0-957c-9d57f62a2238/volumes"
Dec 01 20:59:14 crc kubenswrapper[4802]: I1201 20:59:14.734549 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f597ab05-4236-4a1c-95cb-3ce637a2dd52" path="/var/lib/kubelet/pods/f597ab05-4236-4a1c-95cb-3ce637a2dd52/volumes"
Dec 01 20:59:16 crc kubenswrapper[4802]: I1201 20:59:16.924618 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:16 crc kubenswrapper[4802]: I1201 20:59:16.924989 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:16 crc kubenswrapper[4802]: I1201 20:59:16.980708 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:17 crc kubenswrapper[4802]: I1201 20:59:17.032506 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:17 crc kubenswrapper[4802]: I1201 20:59:17.218969 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mf5g7"]
Dec 01 20:59:18 crc kubenswrapper[4802]: I1201 20:59:18.969419 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mf5g7" podUID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerName="registry-server" containerID="cri-o://e55c26958f2e0870ea597a0bc6c2f409e9029e1c6b95b078d350525917eef443" gracePeriod=2
Dec 01 20:59:19 crc kubenswrapper[4802]: E1201 20:59:19.259311 4802 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode76b9251_7edd_411c_8957_9eb88cc69aa2.slice/crio-conmon-e55c26958f2e0870ea597a0bc6c2f409e9029e1c6b95b078d350525917eef443.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode76b9251_7edd_411c_8957_9eb88cc69aa2.slice/crio-e55c26958f2e0870ea597a0bc6c2f409e9029e1c6b95b078d350525917eef443.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 20:59:19 crc kubenswrapper[4802]: I1201 20:59:19.989596 4802 generic.go:334] "Generic (PLEG): container finished" podID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerID="e55c26958f2e0870ea597a0bc6c2f409e9029e1c6b95b078d350525917eef443" exitCode=0
Dec 01 20:59:19 crc kubenswrapper[4802]: I1201 20:59:19.989837 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mf5g7" event={"ID":"e76b9251-7edd-411c-8957-9eb88cc69aa2","Type":"ContainerDied","Data":"e55c26958f2e0870ea597a0bc6c2f409e9029e1c6b95b078d350525917eef443"}
Dec 01 20:59:19 crc kubenswrapper[4802]: I1201 20:59:19.989950 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mf5g7" event={"ID":"e76b9251-7edd-411c-8957-9eb88cc69aa2","Type":"ContainerDied","Data":"1b5517c142a38c56ca4535c11b7400fe16b8e61398d9d6734a4c010cc9619915"}
Dec 01 20:59:19 crc kubenswrapper[4802]: I1201 20:59:19.989973 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b5517c142a38c56ca4535c11b7400fe16b8e61398d9d6734a4c010cc9619915"
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.018389 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.139535 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-utilities\") pod \"e76b9251-7edd-411c-8957-9eb88cc69aa2\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") "
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.139614 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-catalog-content\") pod \"e76b9251-7edd-411c-8957-9eb88cc69aa2\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") "
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.139729 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v8ck\" (UniqueName: \"kubernetes.io/projected/e76b9251-7edd-411c-8957-9eb88cc69aa2-kube-api-access-5v8ck\") pod \"e76b9251-7edd-411c-8957-9eb88cc69aa2\" (UID: \"e76b9251-7edd-411c-8957-9eb88cc69aa2\") "
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.140504 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-utilities" (OuterVolumeSpecName: "utilities") pod "e76b9251-7edd-411c-8957-9eb88cc69aa2" (UID: "e76b9251-7edd-411c-8957-9eb88cc69aa2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.146403 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e76b9251-7edd-411c-8957-9eb88cc69aa2-kube-api-access-5v8ck" (OuterVolumeSpecName: "kube-api-access-5v8ck") pod "e76b9251-7edd-411c-8957-9eb88cc69aa2" (UID: "e76b9251-7edd-411c-8957-9eb88cc69aa2"). InnerVolumeSpecName "kube-api-access-5v8ck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.196374 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e76b9251-7edd-411c-8957-9eb88cc69aa2" (UID: "e76b9251-7edd-411c-8957-9eb88cc69aa2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.242307 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.242342 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e76b9251-7edd-411c-8957-9eb88cc69aa2-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.242352 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v8ck\" (UniqueName: \"kubernetes.io/projected/e76b9251-7edd-411c-8957-9eb88cc69aa2-kube-api-access-5v8ck\") on node \"crc\" DevicePath \"\""
Dec 01 20:59:20 crc kubenswrapper[4802]: I1201 20:59:20.997522 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mf5g7"
Dec 01 20:59:21 crc kubenswrapper[4802]: I1201 20:59:21.021579 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mf5g7"]
Dec 01 20:59:21 crc kubenswrapper[4802]: I1201 20:59:21.029610 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mf5g7"]
Dec 01 20:59:22 crc kubenswrapper[4802]: I1201 20:59:22.732369 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e76b9251-7edd-411c-8957-9eb88cc69aa2" path="/var/lib/kubelet/pods/e76b9251-7edd-411c-8957-9eb88cc69aa2/volumes"
Dec 01 20:59:26 crc kubenswrapper[4802]: I1201 20:59:26.735956 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"
Dec 01 20:59:26 crc kubenswrapper[4802]: E1201 20:59:26.742571 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 20:59:28 crc kubenswrapper[4802]: I1201 20:59:28.351663 4802 scope.go:117] "RemoveContainer" containerID="70cf64eabf7d4d951d30023add8f6fcf45777ed5ddaccbe3a07ee94762c2e238"
Dec 01 20:59:28 crc kubenswrapper[4802]: I1201 20:59:28.663271 4802 scope.go:117] "RemoveContainer" containerID="190ff2633167971da5c1baa380e299005a4c805ee6f539edfbe9697d88ec2773"
Dec 01 20:59:37 crc kubenswrapper[4802]: E1201 20:59:37.679280 4802 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:41764->38.102.83.151:34181: write tcp 38.102.83.151:41764->38.102.83.151:34181: write: broken pipe
Dec 01 20:59:41 crc kubenswrapper[4802]: I1201 20:59:41.050795 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-bms4z"]
Dec 01 20:59:41 crc kubenswrapper[4802]: I1201 20:59:41.058774 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-bms4z"]
Dec 01 20:59:41 crc kubenswrapper[4802]: I1201 20:59:41.720899 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"
Dec 01 20:59:41 crc kubenswrapper[4802]: E1201 20:59:41.721526 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 20:59:42 crc kubenswrapper[4802]: I1201 20:59:42.743700 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c" path="/var/lib/kubelet/pods/402ca7c9-4910-43ff-b2f0-0e1c3d35ff7c/volumes"
Dec 01 20:59:55 crc kubenswrapper[4802]: I1201 20:59:55.719798 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"
Dec 01 20:59:55 crc kubenswrapper[4802]: E1201 20:59:55.720702 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.161184 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"]
Dec 01 21:00:00 crc kubenswrapper[4802]: E1201 21:00:00.162270 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerName="extract-content"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.162308 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerName="extract-content"
Dec 01 21:00:00 crc kubenswrapper[4802]: E1201 21:00:00.162332 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerName="registry-server"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.162344 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerName="registry-server"
Dec 01 21:00:00 crc kubenswrapper[4802]: E1201 21:00:00.162395 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerName="extract-utilities"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.162407 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerName="extract-utilities"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.162673 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="e76b9251-7edd-411c-8957-9eb88cc69aa2" containerName="registry-server"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.163572 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.165609 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.165857 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.178419 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"]
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.213501 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4478f6e-40e3-4160-be55-f1de6ebf5f63-secret-volume\") pod \"collect-profiles-29410380-bbx68\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.213620 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4478f6e-40e3-4160-be55-f1de6ebf5f63-config-volume\") pod \"collect-profiles-29410380-bbx68\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.213749 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbmtk\" (UniqueName: \"kubernetes.io/projected/a4478f6e-40e3-4160-be55-f1de6ebf5f63-kube-api-access-wbmtk\") pod \"collect-profiles-29410380-bbx68\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.315320 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4478f6e-40e3-4160-be55-f1de6ebf5f63-secret-volume\") pod \"collect-profiles-29410380-bbx68\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.315392 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4478f6e-40e3-4160-be55-f1de6ebf5f63-config-volume\") pod \"collect-profiles-29410380-bbx68\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.315442 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbmtk\" (UniqueName: \"kubernetes.io/projected/a4478f6e-40e3-4160-be55-f1de6ebf5f63-kube-api-access-wbmtk\") pod \"collect-profiles-29410380-bbx68\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.316674 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4478f6e-40e3-4160-be55-f1de6ebf5f63-config-volume\") pod \"collect-profiles-29410380-bbx68\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.335250 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4478f6e-40e3-4160-be55-f1de6ebf5f63-secret-volume\") pod \"collect-profiles-29410380-bbx68\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.337713 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbmtk\" (UniqueName: \"kubernetes.io/projected/a4478f6e-40e3-4160-be55-f1de6ebf5f63-kube-api-access-wbmtk\") pod \"collect-profiles-29410380-bbx68\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.487747 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:00 crc kubenswrapper[4802]: I1201 21:00:00.962361 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"]
Dec 01 21:00:01 crc kubenswrapper[4802]: I1201 21:00:01.435109 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68" event={"ID":"a4478f6e-40e3-4160-be55-f1de6ebf5f63","Type":"ContainerStarted","Data":"7dd3ec7b66f9be7bdcc8454402ec36138484cf4de558b737aba33221230c52fa"}
Dec 01 21:00:02 crc kubenswrapper[4802]: I1201 21:00:02.445184 4802 generic.go:334] "Generic (PLEG): container finished" podID="a4478f6e-40e3-4160-be55-f1de6ebf5f63" containerID="303bf2412b8033673e74d57d6280f6e2e9660f4a0488c4e7c2a3698d65a82a53" exitCode=0
Dec 01 21:00:02 crc kubenswrapper[4802]: I1201 21:00:02.445248 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68" event={"ID":"a4478f6e-40e3-4160-be55-f1de6ebf5f63","Type":"ContainerDied","Data":"303bf2412b8033673e74d57d6280f6e2e9660f4a0488c4e7c2a3698d65a82a53"}
Dec 01 21:00:03 crc kubenswrapper[4802]: I1201 21:00:03.864168 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:03 crc kubenswrapper[4802]: I1201 21:00:03.988575 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4478f6e-40e3-4160-be55-f1de6ebf5f63-secret-volume\") pod \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") "
Dec 01 21:00:03 crc kubenswrapper[4802]: I1201 21:00:03.988678 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4478f6e-40e3-4160-be55-f1de6ebf5f63-config-volume\") pod \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") "
Dec 01 21:00:03 crc kubenswrapper[4802]: I1201 21:00:03.988889 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbmtk\" (UniqueName: \"kubernetes.io/projected/a4478f6e-40e3-4160-be55-f1de6ebf5f63-kube-api-access-wbmtk\") pod \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\" (UID: \"a4478f6e-40e3-4160-be55-f1de6ebf5f63\") "
Dec 01 21:00:03 crc kubenswrapper[4802]: I1201 21:00:03.990820 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4478f6e-40e3-4160-be55-f1de6ebf5f63-config-volume" (OuterVolumeSpecName: "config-volume") pod "a4478f6e-40e3-4160-be55-f1de6ebf5f63" (UID: "a4478f6e-40e3-4160-be55-f1de6ebf5f63"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 21:00:03 crc kubenswrapper[4802]: I1201 21:00:03.998611 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4478f6e-40e3-4160-be55-f1de6ebf5f63-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a4478f6e-40e3-4160-be55-f1de6ebf5f63" (UID: "a4478f6e-40e3-4160-be55-f1de6ebf5f63"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 21:00:03 crc kubenswrapper[4802]: I1201 21:00:03.998708 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4478f6e-40e3-4160-be55-f1de6ebf5f63-kube-api-access-wbmtk" (OuterVolumeSpecName: "kube-api-access-wbmtk") pod "a4478f6e-40e3-4160-be55-f1de6ebf5f63" (UID: "a4478f6e-40e3-4160-be55-f1de6ebf5f63"). InnerVolumeSpecName "kube-api-access-wbmtk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:00:04 crc kubenswrapper[4802]: I1201 21:00:04.091314 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbmtk\" (UniqueName: \"kubernetes.io/projected/a4478f6e-40e3-4160-be55-f1de6ebf5f63-kube-api-access-wbmtk\") on node \"crc\" DevicePath \"\""
Dec 01 21:00:04 crc kubenswrapper[4802]: I1201 21:00:04.091353 4802 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a4478f6e-40e3-4160-be55-f1de6ebf5f63-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 01 21:00:04 crc kubenswrapper[4802]: I1201 21:00:04.091366 4802 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a4478f6e-40e3-4160-be55-f1de6ebf5f63-config-volume\") on node \"crc\" DevicePath \"\""
Dec 01 21:00:04 crc kubenswrapper[4802]: I1201 21:00:04.477044 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68" event={"ID":"a4478f6e-40e3-4160-be55-f1de6ebf5f63","Type":"ContainerDied","Data":"7dd3ec7b66f9be7bdcc8454402ec36138484cf4de558b737aba33221230c52fa"}
Dec 01 21:00:04 crc kubenswrapper[4802]: I1201 21:00:04.477089 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dd3ec7b66f9be7bdcc8454402ec36138484cf4de558b737aba33221230c52fa"
Dec 01 21:00:04 crc kubenswrapper[4802]: I1201 21:00:04.477178 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410380-bbx68"
Dec 01 21:00:04 crc kubenswrapper[4802]: I1201 21:00:04.930821 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7"]
Dec 01 21:00:04 crc kubenswrapper[4802]: I1201 21:00:04.939525 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410335-8p7k7"]
Dec 01 21:00:06 crc kubenswrapper[4802]: I1201 21:00:06.734890 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc4bc05f-4541-4ffe-84b9-b3b54d244094" path="/var/lib/kubelet/pods/dc4bc05f-4541-4ffe-84b9-b3b54d244094/volumes"
Dec 01 21:00:08 crc kubenswrapper[4802]: I1201 21:00:08.726932 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"
Dec 01 21:00:08 crc kubenswrapper[4802]: E1201 21:00:08.727846 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 21:00:21 crc kubenswrapper[4802]: I1201 21:00:21.720375 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"
Dec 01 21:00:21 crc kubenswrapper[4802]: E1201 21:00:21.721548 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 21:00:28 crc kubenswrapper[4802]: I1201 21:00:28.782803 4802 scope.go:117] "RemoveContainer" containerID="6175fdd46891bd81af38fd579143a31ab1e715b0466c87fce1a1553120da1c14"
Dec 01 21:00:28 crc kubenswrapper[4802]: I1201 21:00:28.813342 4802 scope.go:117] "RemoveContainer" containerID="dc420b054e47fa4ef46ad49e178b81450b5981ddd17e3ae49488b58443937c91"
Dec 01 21:00:32 crc kubenswrapper[4802]: I1201 21:00:32.720234 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"
Dec 01 21:00:32 crc kubenswrapper[4802]: E1201 21:00:32.720998 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 21:00:43 crc kubenswrapper[4802]: I1201 21:00:43.720898 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca"
Dec 01 21:00:43 crc kubenswrapper[4802]: E1201 21:00:43.721852 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 21:00:45 crc kubenswrapper[4802]: I1201 21:00:45.911974 4802 generic.go:334] "Generic (PLEG): container finished" podID="d1950264-c629-4757-b443-dcecf41ae2a1" containerID="21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa" exitCode=0
Dec 01 21:00:45 crc kubenswrapper[4802]: I1201 21:00:45.912086 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6rhk/must-gather-25xw2" event={"ID":"d1950264-c629-4757-b443-dcecf41ae2a1","Type":"ContainerDied","Data":"21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa"}
Dec 01 21:00:45 crc kubenswrapper[4802]: I1201 21:00:45.914072 4802 scope.go:117] "RemoveContainer" containerID="21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa"
Dec 01 21:00:46 crc kubenswrapper[4802]: I1201 21:00:46.468500 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6rhk_must-gather-25xw2_d1950264-c629-4757-b443-dcecf41ae2a1/gather/0.log"
Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.191503 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t6rhk/must-gather-25xw2"]
Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.192318 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-t6rhk/must-gather-25xw2" podUID="d1950264-c629-4757-b443-dcecf41ae2a1" containerName="copy" containerID="cri-o://4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26" gracePeriod=2
Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.203126 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t6rhk/must-gather-25xw2"]
Dec 01 21:00:54 crc kubenswrapper[4802]: 
I1201 21:00:54.602980 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6rhk_must-gather-25xw2_d1950264-c629-4757-b443-dcecf41ae2a1/copy/0.log" Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.603408 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.688551 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1950264-c629-4757-b443-dcecf41ae2a1-must-gather-output\") pod \"d1950264-c629-4757-b443-dcecf41ae2a1\" (UID: \"d1950264-c629-4757-b443-dcecf41ae2a1\") " Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.688955 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z29zw\" (UniqueName: \"kubernetes.io/projected/d1950264-c629-4757-b443-dcecf41ae2a1-kube-api-access-z29zw\") pod \"d1950264-c629-4757-b443-dcecf41ae2a1\" (UID: \"d1950264-c629-4757-b443-dcecf41ae2a1\") " Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.694397 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1950264-c629-4757-b443-dcecf41ae2a1-kube-api-access-z29zw" (OuterVolumeSpecName: "kube-api-access-z29zw") pod "d1950264-c629-4757-b443-dcecf41ae2a1" (UID: "d1950264-c629-4757-b443-dcecf41ae2a1"). InnerVolumeSpecName "kube-api-access-z29zw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.791323 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z29zw\" (UniqueName: \"kubernetes.io/projected/d1950264-c629-4757-b443-dcecf41ae2a1-kube-api-access-z29zw\") on node \"crc\" DevicePath \"\"" Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.847427 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1950264-c629-4757-b443-dcecf41ae2a1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d1950264-c629-4757-b443-dcecf41ae2a1" (UID: "d1950264-c629-4757-b443-dcecf41ae2a1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.894538 4802 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d1950264-c629-4757-b443-dcecf41ae2a1-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.992446 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t6rhk_must-gather-25xw2_d1950264-c629-4757-b443-dcecf41ae2a1/copy/0.log" Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.994703 4802 generic.go:334] "Generic (PLEG): container finished" podID="d1950264-c629-4757-b443-dcecf41ae2a1" containerID="4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26" exitCode=143 Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.994781 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t6rhk/must-gather-25xw2" Dec 01 21:00:54 crc kubenswrapper[4802]: I1201 21:00:54.994789 4802 scope.go:117] "RemoveContainer" containerID="4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26" Dec 01 21:00:55 crc kubenswrapper[4802]: I1201 21:00:55.017279 4802 scope.go:117] "RemoveContainer" containerID="21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa" Dec 01 21:00:55 crc kubenswrapper[4802]: I1201 21:00:55.110142 4802 scope.go:117] "RemoveContainer" containerID="4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26" Dec 01 21:00:55 crc kubenswrapper[4802]: E1201 21:00:55.110839 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26\": container with ID starting with 4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26 not found: ID does not exist" containerID="4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26" Dec 01 21:00:55 crc kubenswrapper[4802]: I1201 21:00:55.110885 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26"} err="failed to get container status \"4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26\": rpc error: code = NotFound desc = could not find container \"4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26\": container with ID starting with 4e92fcd5097ac294bad9001bafd4a9e0135c3bacaf47fd2a1d8d1771064a6a26 not found: ID does not exist" Dec 01 21:00:55 crc kubenswrapper[4802]: I1201 21:00:55.110914 4802 scope.go:117] "RemoveContainer" containerID="21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa" Dec 01 21:00:55 crc kubenswrapper[4802]: E1201 21:00:55.111300 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa\": container with ID starting with 21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa not found: ID does not exist" containerID="21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa" Dec 01 21:00:55 crc kubenswrapper[4802]: I1201 21:00:55.111330 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa"} err="failed to get container status \"21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa\": rpc error: code = NotFound desc = could not find container \"21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa\": container with ID starting with 21946d20041c4d2bb602cc4543cc6c921c42a26b9a9d1d079a3a66fcb89e41fa not found: ID does not exist" Dec 01 21:00:55 crc kubenswrapper[4802]: I1201 21:00:55.720270 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 21:00:55 crc kubenswrapper[4802]: E1201 21:00:55.721887 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:00:56 crc kubenswrapper[4802]: I1201 21:00:56.730611 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1950264-c629-4757-b443-dcecf41ae2a1" path="/var/lib/kubelet/pods/d1950264-c629-4757-b443-dcecf41ae2a1/volumes" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.152124 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29410381-4blfs"] Dec 01 
21:01:00 crc kubenswrapper[4802]: E1201 21:01:00.153298 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1950264-c629-4757-b443-dcecf41ae2a1" containerName="gather" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.153322 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1950264-c629-4757-b443-dcecf41ae2a1" containerName="gather" Dec 01 21:01:00 crc kubenswrapper[4802]: E1201 21:01:00.153366 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1950264-c629-4757-b443-dcecf41ae2a1" containerName="copy" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.153375 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1950264-c629-4757-b443-dcecf41ae2a1" containerName="copy" Dec 01 21:01:00 crc kubenswrapper[4802]: E1201 21:01:00.153395 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4478f6e-40e3-4160-be55-f1de6ebf5f63" containerName="collect-profiles" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.153405 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4478f6e-40e3-4160-be55-f1de6ebf5f63" containerName="collect-profiles" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.153598 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4478f6e-40e3-4160-be55-f1de6ebf5f63" containerName="collect-profiles" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.153628 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1950264-c629-4757-b443-dcecf41ae2a1" containerName="copy" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.153635 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1950264-c629-4757-b443-dcecf41ae2a1" containerName="gather" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.154331 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.172142 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410381-4blfs"] Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.297815 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-config-data\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.297881 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-fernet-keys\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.297924 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-combined-ca-bundle\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.298035 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w542\" (UniqueName: \"kubernetes.io/projected/77085b4f-6030-4e2f-ac42-0a70cb2b269c-kube-api-access-7w542\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.399955 4802 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-7w542\" (UniqueName: \"kubernetes.io/projected/77085b4f-6030-4e2f-ac42-0a70cb2b269c-kube-api-access-7w542\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.400350 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-config-data\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.400605 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-fernet-keys\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.400753 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-combined-ca-bundle\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.407904 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-fernet-keys\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.407948 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-config-data\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.409337 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-combined-ca-bundle\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.418857 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w542\" (UniqueName: \"kubernetes.io/projected/77085b4f-6030-4e2f-ac42-0a70cb2b269c-kube-api-access-7w542\") pod \"keystone-cron-29410381-4blfs\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.477922 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:00 crc kubenswrapper[4802]: I1201 21:01:00.980016 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410381-4blfs"] Dec 01 21:01:01 crc kubenswrapper[4802]: I1201 21:01:01.072983 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410381-4blfs" event={"ID":"77085b4f-6030-4e2f-ac42-0a70cb2b269c","Type":"ContainerStarted","Data":"615d34be1b65f66e8036eafd357cddf6e6db169bf80240a4b37f72a5477a704f"} Dec 01 21:01:02 crc kubenswrapper[4802]: I1201 21:01:02.103566 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410381-4blfs" event={"ID":"77085b4f-6030-4e2f-ac42-0a70cb2b269c","Type":"ContainerStarted","Data":"b3118c64315c12d1167084abf3976dafb5f0d892814ab3e4b5306c5f9d870d85"} Dec 01 21:01:02 crc kubenswrapper[4802]: I1201 21:01:02.128893 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29410381-4blfs" podStartSLOduration=2.128873703 podStartE2EDuration="2.128873703s" podCreationTimestamp="2025-12-01 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:01:02.121764752 +0000 UTC m=+3883.684324393" watchObservedRunningTime="2025-12-01 21:01:02.128873703 +0000 UTC m=+3883.691433354" Dec 01 21:01:04 crc kubenswrapper[4802]: I1201 21:01:04.123353 4802 generic.go:334] "Generic (PLEG): container finished" podID="77085b4f-6030-4e2f-ac42-0a70cb2b269c" containerID="b3118c64315c12d1167084abf3976dafb5f0d892814ab3e4b5306c5f9d870d85" exitCode=0 Dec 01 21:01:04 crc kubenswrapper[4802]: I1201 21:01:04.123421 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410381-4blfs" 
event={"ID":"77085b4f-6030-4e2f-ac42-0a70cb2b269c","Type":"ContainerDied","Data":"b3118c64315c12d1167084abf3976dafb5f0d892814ab3e4b5306c5f9d870d85"} Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.058092 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.141951 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410381-4blfs" event={"ID":"77085b4f-6030-4e2f-ac42-0a70cb2b269c","Type":"ContainerDied","Data":"615d34be1b65f66e8036eafd357cddf6e6db169bf80240a4b37f72a5477a704f"} Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.141989 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="615d34be1b65f66e8036eafd357cddf6e6db169bf80240a4b37f72a5477a704f" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.142021 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29410381-4blfs" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.204947 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tdc7l"] Dec 01 21:01:06 crc kubenswrapper[4802]: E1201 21:01:06.205352 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77085b4f-6030-4e2f-ac42-0a70cb2b269c" containerName="keystone-cron" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.205372 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="77085b4f-6030-4e2f-ac42-0a70cb2b269c" containerName="keystone-cron" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.205553 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="77085b4f-6030-4e2f-ac42-0a70cb2b269c" containerName="keystone-cron" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.206840 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.212994 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w542\" (UniqueName: \"kubernetes.io/projected/77085b4f-6030-4e2f-ac42-0a70cb2b269c-kube-api-access-7w542\") pod \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.213055 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-combined-ca-bundle\") pod \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.213384 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-fernet-keys\") pod \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.214123 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-config-data\") pod \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\" (UID: \"77085b4f-6030-4e2f-ac42-0a70cb2b269c\") " Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.214978 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-utilities\") pod \"redhat-marketplace-tdc7l\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.215064 4802 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-catalog-content\") pod \"redhat-marketplace-tdc7l\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.217826 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6kw9\" (UniqueName: \"kubernetes.io/projected/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-kube-api-access-g6kw9\") pod \"redhat-marketplace-tdc7l\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.220940 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "77085b4f-6030-4e2f-ac42-0a70cb2b269c" (UID: "77085b4f-6030-4e2f-ac42-0a70cb2b269c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.221105 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdc7l"] Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.232721 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77085b4f-6030-4e2f-ac42-0a70cb2b269c-kube-api-access-7w542" (OuterVolumeSpecName: "kube-api-access-7w542") pod "77085b4f-6030-4e2f-ac42-0a70cb2b269c" (UID: "77085b4f-6030-4e2f-ac42-0a70cb2b269c"). InnerVolumeSpecName "kube-api-access-7w542". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.257576 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77085b4f-6030-4e2f-ac42-0a70cb2b269c" (UID: "77085b4f-6030-4e2f-ac42-0a70cb2b269c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.307569 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-config-data" (OuterVolumeSpecName: "config-data") pod "77085b4f-6030-4e2f-ac42-0a70cb2b269c" (UID: "77085b4f-6030-4e2f-ac42-0a70cb2b269c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.319613 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-utilities\") pod \"redhat-marketplace-tdc7l\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.319671 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-catalog-content\") pod \"redhat-marketplace-tdc7l\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.319781 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6kw9\" (UniqueName: \"kubernetes.io/projected/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-kube-api-access-g6kw9\") pod 
\"redhat-marketplace-tdc7l\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.319927 4802 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.319949 4802 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.319962 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w542\" (UniqueName: \"kubernetes.io/projected/77085b4f-6030-4e2f-ac42-0a70cb2b269c-kube-api-access-7w542\") on node \"crc\" DevicePath \"\"" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.319975 4802 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77085b4f-6030-4e2f-ac42-0a70cb2b269c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.320154 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-utilities\") pod \"redhat-marketplace-tdc7l\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.320287 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-catalog-content\") pod \"redhat-marketplace-tdc7l\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc 
kubenswrapper[4802]: I1201 21:01:06.341913 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6kw9\" (UniqueName: \"kubernetes.io/projected/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-kube-api-access-g6kw9\") pod \"redhat-marketplace-tdc7l\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.358675 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.720894 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 21:01:06 crc kubenswrapper[4802]: E1201 21:01:06.721659 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:01:06 crc kubenswrapper[4802]: I1201 21:01:06.849566 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdc7l"] Dec 01 21:01:07 crc kubenswrapper[4802]: I1201 21:01:07.151775 4802 generic.go:334] "Generic (PLEG): container finished" podID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerID="66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772" exitCode=0 Dec 01 21:01:07 crc kubenswrapper[4802]: I1201 21:01:07.151847 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdc7l" event={"ID":"ca1f6d6b-0145-4af4-9bd5-028f78ebf947","Type":"ContainerDied","Data":"66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772"} Dec 01 
21:01:07 crc kubenswrapper[4802]: I1201 21:01:07.152075 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdc7l" event={"ID":"ca1f6d6b-0145-4af4-9bd5-028f78ebf947","Type":"ContainerStarted","Data":"5a3385abe15bd9c800b21b3f14d6d94d3f8d065ee71ea8c07f391124129cafef"} Dec 01 21:01:07 crc kubenswrapper[4802]: I1201 21:01:07.154143 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 21:01:09 crc kubenswrapper[4802]: I1201 21:01:09.190363 4802 generic.go:334] "Generic (PLEG): container finished" podID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerID="57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879" exitCode=0 Dec 01 21:01:09 crc kubenswrapper[4802]: I1201 21:01:09.190423 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdc7l" event={"ID":"ca1f6d6b-0145-4af4-9bd5-028f78ebf947","Type":"ContainerDied","Data":"57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879"} Dec 01 21:01:10 crc kubenswrapper[4802]: I1201 21:01:10.202633 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdc7l" event={"ID":"ca1f6d6b-0145-4af4-9bd5-028f78ebf947","Type":"ContainerStarted","Data":"47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd"} Dec 01 21:01:10 crc kubenswrapper[4802]: I1201 21:01:10.220429 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tdc7l" podStartSLOduration=1.566629176 podStartE2EDuration="4.220405715s" podCreationTimestamp="2025-12-01 21:01:06 +0000 UTC" firstStartedPulling="2025-12-01 21:01:07.153807342 +0000 UTC m=+3888.716367003" lastFinishedPulling="2025-12-01 21:01:09.807583881 +0000 UTC m=+3891.370143542" observedRunningTime="2025-12-01 21:01:10.215888165 +0000 UTC m=+3891.778447806" watchObservedRunningTime="2025-12-01 21:01:10.220405715 +0000 UTC 
m=+3891.782965356" Dec 01 21:01:16 crc kubenswrapper[4802]: I1201 21:01:16.359761 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:16 crc kubenswrapper[4802]: I1201 21:01:16.361469 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:16 crc kubenswrapper[4802]: I1201 21:01:16.406899 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:17 crc kubenswrapper[4802]: I1201 21:01:17.312855 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:17 crc kubenswrapper[4802]: I1201 21:01:17.364406 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdc7l"] Dec 01 21:01:18 crc kubenswrapper[4802]: I1201 21:01:18.727896 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 21:01:18 crc kubenswrapper[4802]: E1201 21:01:18.728907 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:01:19 crc kubenswrapper[4802]: I1201 21:01:19.294234 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tdc7l" podUID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerName="registry-server" containerID="cri-o://47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd" gracePeriod=2 Dec 01 
21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.048273 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.107749 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-utilities\") pod \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.107850 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-catalog-content\") pod \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.107906 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6kw9\" (UniqueName: \"kubernetes.io/projected/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-kube-api-access-g6kw9\") pod \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\" (UID: \"ca1f6d6b-0145-4af4-9bd5-028f78ebf947\") " Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.108845 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-utilities" (OuterVolumeSpecName: "utilities") pod "ca1f6d6b-0145-4af4-9bd5-028f78ebf947" (UID: "ca1f6d6b-0145-4af4-9bd5-028f78ebf947"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.129018 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca1f6d6b-0145-4af4-9bd5-028f78ebf947" (UID: "ca1f6d6b-0145-4af4-9bd5-028f78ebf947"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.150138 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-kube-api-access-g6kw9" (OuterVolumeSpecName: "kube-api-access-g6kw9") pod "ca1f6d6b-0145-4af4-9bd5-028f78ebf947" (UID: "ca1f6d6b-0145-4af4-9bd5-028f78ebf947"). InnerVolumeSpecName "kube-api-access-g6kw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.210573 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.210626 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.210637 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6kw9\" (UniqueName: \"kubernetes.io/projected/ca1f6d6b-0145-4af4-9bd5-028f78ebf947-kube-api-access-g6kw9\") on node \"crc\" DevicePath \"\"" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.303555 4802 generic.go:334] "Generic (PLEG): container finished" podID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" 
containerID="47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd" exitCode=0 Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.303605 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdc7l" event={"ID":"ca1f6d6b-0145-4af4-9bd5-028f78ebf947","Type":"ContainerDied","Data":"47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd"} Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.303638 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdc7l" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.303667 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdc7l" event={"ID":"ca1f6d6b-0145-4af4-9bd5-028f78ebf947","Type":"ContainerDied","Data":"5a3385abe15bd9c800b21b3f14d6d94d3f8d065ee71ea8c07f391124129cafef"} Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.303705 4802 scope.go:117] "RemoveContainer" containerID="47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.346766 4802 scope.go:117] "RemoveContainer" containerID="57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.350773 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdc7l"] Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.360804 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdc7l"] Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.366354 4802 scope.go:117] "RemoveContainer" containerID="66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.408467 4802 scope.go:117] "RemoveContainer" containerID="47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd" Dec 01 
21:01:20 crc kubenswrapper[4802]: E1201 21:01:20.408941 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd\": container with ID starting with 47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd not found: ID does not exist" containerID="47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.408983 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd"} err="failed to get container status \"47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd\": rpc error: code = NotFound desc = could not find container \"47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd\": container with ID starting with 47fe65307bbaacd5d81f8696c375b1498010606acddc6bff3383729abded62bd not found: ID does not exist" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.409011 4802 scope.go:117] "RemoveContainer" containerID="57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879" Dec 01 21:01:20 crc kubenswrapper[4802]: E1201 21:01:20.409338 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879\": container with ID starting with 57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879 not found: ID does not exist" containerID="57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.409394 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879"} err="failed to get container status 
\"57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879\": rpc error: code = NotFound desc = could not find container \"57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879\": container with ID starting with 57a9342c538ff71ec9d6c8302e669deb221762c3397e63fe34d16031d217c879 not found: ID does not exist" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.409424 4802 scope.go:117] "RemoveContainer" containerID="66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772" Dec 01 21:01:20 crc kubenswrapper[4802]: E1201 21:01:20.409700 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772\": container with ID starting with 66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772 not found: ID does not exist" containerID="66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.409723 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772"} err="failed to get container status \"66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772\": rpc error: code = NotFound desc = could not find container \"66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772\": container with ID starting with 66021ca1c8f03cc0bb878e70a11392277af69f62a570fae8533374051908a772 not found: ID does not exist" Dec 01 21:01:20 crc kubenswrapper[4802]: I1201 21:01:20.737715 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" path="/var/lib/kubelet/pods/ca1f6d6b-0145-4af4-9bd5-028f78ebf947/volumes" Dec 01 21:01:33 crc kubenswrapper[4802]: I1201 21:01:33.719638 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 
21:01:34 crc kubenswrapper[4802]: I1201 21:01:34.458771 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"9d25c80dbaea60b5c692e1f1632f7e7ea1d154c74635474abccc9cbdc7b629eb"} Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.608514 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58wgl/must-gather-fz6ld"] Dec 01 21:03:22 crc kubenswrapper[4802]: E1201 21:03:22.609373 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerName="extract-content" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.609386 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerName="extract-content" Dec 01 21:03:22 crc kubenswrapper[4802]: E1201 21:03:22.609419 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerName="extract-utilities" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.609427 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerName="extract-utilities" Dec 01 21:03:22 crc kubenswrapper[4802]: E1201 21:03:22.609459 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerName="registry-server" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.609466 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerName="registry-server" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.609661 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1f6d6b-0145-4af4-9bd5-028f78ebf947" containerName="registry-server" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.610912 4802 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.613256 4802 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-58wgl"/"default-dockercfg-pnbgh" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.614253 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-58wgl"/"kube-root-ca.crt" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.617244 4802 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-58wgl"/"openshift-service-ca.crt" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.623759 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58wgl/must-gather-fz6ld"] Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.656951 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c03def70-5ce8-4c92-90f3-e6a80af5f461-must-gather-output\") pod \"must-gather-fz6ld\" (UID: \"c03def70-5ce8-4c92-90f3-e6a80af5f461\") " pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.657297 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzb7b\" (UniqueName: \"kubernetes.io/projected/c03def70-5ce8-4c92-90f3-e6a80af5f461-kube-api-access-wzb7b\") pod \"must-gather-fz6ld\" (UID: \"c03def70-5ce8-4c92-90f3-e6a80af5f461\") " pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.758138 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c03def70-5ce8-4c92-90f3-e6a80af5f461-must-gather-output\") pod \"must-gather-fz6ld\" (UID: \"c03def70-5ce8-4c92-90f3-e6a80af5f461\") " 
pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.758303 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzb7b\" (UniqueName: \"kubernetes.io/projected/c03def70-5ce8-4c92-90f3-e6a80af5f461-kube-api-access-wzb7b\") pod \"must-gather-fz6ld\" (UID: \"c03def70-5ce8-4c92-90f3-e6a80af5f461\") " pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.758746 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c03def70-5ce8-4c92-90f3-e6a80af5f461-must-gather-output\") pod \"must-gather-fz6ld\" (UID: \"c03def70-5ce8-4c92-90f3-e6a80af5f461\") " pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.777732 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzb7b\" (UniqueName: \"kubernetes.io/projected/c03def70-5ce8-4c92-90f3-e6a80af5f461-kube-api-access-wzb7b\") pod \"must-gather-fz6ld\" (UID: \"c03def70-5ce8-4c92-90f3-e6a80af5f461\") " pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:03:22 crc kubenswrapper[4802]: I1201 21:03:22.968550 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:03:23 crc kubenswrapper[4802]: I1201 21:03:23.411002 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58wgl/must-gather-fz6ld"] Dec 01 21:03:24 crc kubenswrapper[4802]: I1201 21:03:24.043878 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/must-gather-fz6ld" event={"ID":"c03def70-5ce8-4c92-90f3-e6a80af5f461","Type":"ContainerStarted","Data":"bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724"} Dec 01 21:03:24 crc kubenswrapper[4802]: I1201 21:03:24.044190 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/must-gather-fz6ld" event={"ID":"c03def70-5ce8-4c92-90f3-e6a80af5f461","Type":"ContainerStarted","Data":"c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49"} Dec 01 21:03:24 crc kubenswrapper[4802]: I1201 21:03:24.044223 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/must-gather-fz6ld" event={"ID":"c03def70-5ce8-4c92-90f3-e6a80af5f461","Type":"ContainerStarted","Data":"a5f538debfe3adff3541597bb7025c86e7b66b76b0f303644abc836491b008fd"} Dec 01 21:03:24 crc kubenswrapper[4802]: I1201 21:03:24.063184 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-58wgl/must-gather-fz6ld" podStartSLOduration=2.063164057 podStartE2EDuration="2.063164057s" podCreationTimestamp="2025-12-01 21:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:03:24.054875908 +0000 UTC m=+4025.617435549" watchObservedRunningTime="2025-12-01 21:03:24.063164057 +0000 UTC m=+4025.625723698" Dec 01 21:03:27 crc kubenswrapper[4802]: I1201 21:03:27.651841 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58wgl/crc-debug-s9sff"] Dec 01 21:03:27 crc kubenswrapper[4802]: 
I1201 21:03:27.653708 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:03:27 crc kubenswrapper[4802]: I1201 21:03:27.794110 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l6b7\" (UniqueName: \"kubernetes.io/projected/d3b39610-7172-4f11-aceb-0ef4a767c12f-kube-api-access-6l6b7\") pod \"crc-debug-s9sff\" (UID: \"d3b39610-7172-4f11-aceb-0ef4a767c12f\") " pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:03:27 crc kubenswrapper[4802]: I1201 21:03:27.794441 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3b39610-7172-4f11-aceb-0ef4a767c12f-host\") pod \"crc-debug-s9sff\" (UID: \"d3b39610-7172-4f11-aceb-0ef4a767c12f\") " pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:03:27 crc kubenswrapper[4802]: I1201 21:03:27.896617 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l6b7\" (UniqueName: \"kubernetes.io/projected/d3b39610-7172-4f11-aceb-0ef4a767c12f-kube-api-access-6l6b7\") pod \"crc-debug-s9sff\" (UID: \"d3b39610-7172-4f11-aceb-0ef4a767c12f\") " pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:03:27 crc kubenswrapper[4802]: I1201 21:03:27.896710 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3b39610-7172-4f11-aceb-0ef4a767c12f-host\") pod \"crc-debug-s9sff\" (UID: \"d3b39610-7172-4f11-aceb-0ef4a767c12f\") " pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:03:27 crc kubenswrapper[4802]: I1201 21:03:27.897063 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3b39610-7172-4f11-aceb-0ef4a767c12f-host\") pod \"crc-debug-s9sff\" (UID: \"d3b39610-7172-4f11-aceb-0ef4a767c12f\") 
" pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:03:27 crc kubenswrapper[4802]: I1201 21:03:27.925762 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l6b7\" (UniqueName: \"kubernetes.io/projected/d3b39610-7172-4f11-aceb-0ef4a767c12f-kube-api-access-6l6b7\") pod \"crc-debug-s9sff\" (UID: \"d3b39610-7172-4f11-aceb-0ef4a767c12f\") " pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:03:27 crc kubenswrapper[4802]: I1201 21:03:27.981379 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:03:29 crc kubenswrapper[4802]: I1201 21:03:29.011370 4802 scope.go:117] "RemoveContainer" containerID="65d8c45d5d820fa0b45235f5349129e7536a5ba2ebcf2b3d6c5486312808ef05" Dec 01 21:03:29 crc kubenswrapper[4802]: I1201 21:03:29.098029 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/crc-debug-s9sff" event={"ID":"d3b39610-7172-4f11-aceb-0ef4a767c12f","Type":"ContainerStarted","Data":"ee68fc32f65e79ddbc20579149624bee16f2a66201673d9b086da3928054a16d"} Dec 01 21:03:29 crc kubenswrapper[4802]: I1201 21:03:29.098420 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/crc-debug-s9sff" event={"ID":"d3b39610-7172-4f11-aceb-0ef4a767c12f","Type":"ContainerStarted","Data":"0a6b8301f7ed8e78e1b0bcce2500dfd33e374648cbfb886cd74b59b41f684aa6"} Dec 01 21:03:29 crc kubenswrapper[4802]: I1201 21:03:29.116656 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-58wgl/crc-debug-s9sff" podStartSLOduration=2.116630678 podStartE2EDuration="2.116630678s" podCreationTimestamp="2025-12-01 21:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 21:03:29.109383691 +0000 UTC m=+4030.671943342" watchObservedRunningTime="2025-12-01 21:03:29.116630678 +0000 
UTC m=+4030.679190319" Dec 01 21:03:58 crc kubenswrapper[4802]: I1201 21:03:58.088486 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:03:58 crc kubenswrapper[4802]: I1201 21:03:58.088980 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:04:06 crc kubenswrapper[4802]: I1201 21:04:06.386476 4802 generic.go:334] "Generic (PLEG): container finished" podID="d3b39610-7172-4f11-aceb-0ef4a767c12f" containerID="ee68fc32f65e79ddbc20579149624bee16f2a66201673d9b086da3928054a16d" exitCode=0 Dec 01 21:04:06 crc kubenswrapper[4802]: I1201 21:04:06.386693 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/crc-debug-s9sff" event={"ID":"d3b39610-7172-4f11-aceb-0ef4a767c12f","Type":"ContainerDied","Data":"ee68fc32f65e79ddbc20579149624bee16f2a66201673d9b086da3928054a16d"} Dec 01 21:04:07 crc kubenswrapper[4802]: I1201 21:04:07.520834 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:04:07 crc kubenswrapper[4802]: I1201 21:04:07.559852 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58wgl/crc-debug-s9sff"] Dec 01 21:04:07 crc kubenswrapper[4802]: I1201 21:04:07.570243 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58wgl/crc-debug-s9sff"] Dec 01 21:04:07 crc kubenswrapper[4802]: I1201 21:04:07.608325 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l6b7\" (UniqueName: \"kubernetes.io/projected/d3b39610-7172-4f11-aceb-0ef4a767c12f-kube-api-access-6l6b7\") pod \"d3b39610-7172-4f11-aceb-0ef4a767c12f\" (UID: \"d3b39610-7172-4f11-aceb-0ef4a767c12f\") " Dec 01 21:04:07 crc kubenswrapper[4802]: I1201 21:04:07.608758 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3b39610-7172-4f11-aceb-0ef4a767c12f-host\") pod \"d3b39610-7172-4f11-aceb-0ef4a767c12f\" (UID: \"d3b39610-7172-4f11-aceb-0ef4a767c12f\") " Dec 01 21:04:07 crc kubenswrapper[4802]: I1201 21:04:07.608899 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b39610-7172-4f11-aceb-0ef4a767c12f-host" (OuterVolumeSpecName: "host") pod "d3b39610-7172-4f11-aceb-0ef4a767c12f" (UID: "d3b39610-7172-4f11-aceb-0ef4a767c12f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:04:07 crc kubenswrapper[4802]: I1201 21:04:07.609537 4802 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3b39610-7172-4f11-aceb-0ef4a767c12f-host\") on node \"crc\" DevicePath \"\"" Dec 01 21:04:07 crc kubenswrapper[4802]: I1201 21:04:07.623396 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3b39610-7172-4f11-aceb-0ef4a767c12f-kube-api-access-6l6b7" (OuterVolumeSpecName: "kube-api-access-6l6b7") pod "d3b39610-7172-4f11-aceb-0ef4a767c12f" (UID: "d3b39610-7172-4f11-aceb-0ef4a767c12f"). InnerVolumeSpecName "kube-api-access-6l6b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:04:07 crc kubenswrapper[4802]: I1201 21:04:07.711359 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l6b7\" (UniqueName: \"kubernetes.io/projected/d3b39610-7172-4f11-aceb-0ef4a767c12f-kube-api-access-6l6b7\") on node \"crc\" DevicePath \"\"" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.410363 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a6b8301f7ed8e78e1b0bcce2500dfd33e374648cbfb886cd74b59b41f684aa6" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.410628 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-s9sff" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.729343 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3b39610-7172-4f11-aceb-0ef4a767c12f" path="/var/lib/kubelet/pods/d3b39610-7172-4f11-aceb-0ef4a767c12f/volumes" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.760173 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58wgl/crc-debug-r6cst"] Dec 01 21:04:08 crc kubenswrapper[4802]: E1201 21:04:08.760680 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3b39610-7172-4f11-aceb-0ef4a767c12f" containerName="container-00" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.760701 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3b39610-7172-4f11-aceb-0ef4a767c12f" containerName="container-00" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.760907 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3b39610-7172-4f11-aceb-0ef4a767c12f" containerName="container-00" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.761645 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.833397 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/265175b5-cbb8-4c80-8af7-58026774cbe5-host\") pod \"crc-debug-r6cst\" (UID: \"265175b5-cbb8-4c80-8af7-58026774cbe5\") " pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.833442 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbmqc\" (UniqueName: \"kubernetes.io/projected/265175b5-cbb8-4c80-8af7-58026774cbe5-kube-api-access-mbmqc\") pod \"crc-debug-r6cst\" (UID: \"265175b5-cbb8-4c80-8af7-58026774cbe5\") " pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.935707 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/265175b5-cbb8-4c80-8af7-58026774cbe5-host\") pod \"crc-debug-r6cst\" (UID: \"265175b5-cbb8-4c80-8af7-58026774cbe5\") " pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.935838 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/265175b5-cbb8-4c80-8af7-58026774cbe5-host\") pod \"crc-debug-r6cst\" (UID: \"265175b5-cbb8-4c80-8af7-58026774cbe5\") " pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:08 crc kubenswrapper[4802]: I1201 21:04:08.936020 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbmqc\" (UniqueName: \"kubernetes.io/projected/265175b5-cbb8-4c80-8af7-58026774cbe5-kube-api-access-mbmqc\") pod \"crc-debug-r6cst\" (UID: \"265175b5-cbb8-4c80-8af7-58026774cbe5\") " pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:09 crc 
kubenswrapper[4802]: I1201 21:04:09.252038 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbmqc\" (UniqueName: \"kubernetes.io/projected/265175b5-cbb8-4c80-8af7-58026774cbe5-kube-api-access-mbmqc\") pod \"crc-debug-r6cst\" (UID: \"265175b5-cbb8-4c80-8af7-58026774cbe5\") " pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:09 crc kubenswrapper[4802]: I1201 21:04:09.396053 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:10 crc kubenswrapper[4802]: I1201 21:04:10.429417 4802 generic.go:334] "Generic (PLEG): container finished" podID="265175b5-cbb8-4c80-8af7-58026774cbe5" containerID="886fecb64b19c7e447e305c1ad337e2c326f6a0ef16a7ae748bcc371b6849fea" exitCode=0 Dec 01 21:04:10 crc kubenswrapper[4802]: I1201 21:04:10.429504 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/crc-debug-r6cst" event={"ID":"265175b5-cbb8-4c80-8af7-58026774cbe5","Type":"ContainerDied","Data":"886fecb64b19c7e447e305c1ad337e2c326f6a0ef16a7ae748bcc371b6849fea"} Dec 01 21:04:10 crc kubenswrapper[4802]: I1201 21:04:10.429786 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/crc-debug-r6cst" event={"ID":"265175b5-cbb8-4c80-8af7-58026774cbe5","Type":"ContainerStarted","Data":"165af56d52950f01bf2f452d17f44423f9fa88e8af680896d1fccf4f5859084c"} Dec 01 21:04:10 crc kubenswrapper[4802]: I1201 21:04:10.898579 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58wgl/crc-debug-r6cst"] Dec 01 21:04:10 crc kubenswrapper[4802]: I1201 21:04:10.909773 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58wgl/crc-debug-r6cst"] Dec 01 21:04:11 crc kubenswrapper[4802]: I1201 21:04:11.535610 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:11 crc kubenswrapper[4802]: I1201 21:04:11.580491 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbmqc\" (UniqueName: \"kubernetes.io/projected/265175b5-cbb8-4c80-8af7-58026774cbe5-kube-api-access-mbmqc\") pod \"265175b5-cbb8-4c80-8af7-58026774cbe5\" (UID: \"265175b5-cbb8-4c80-8af7-58026774cbe5\") " Dec 01 21:04:11 crc kubenswrapper[4802]: I1201 21:04:11.580638 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/265175b5-cbb8-4c80-8af7-58026774cbe5-host\") pod \"265175b5-cbb8-4c80-8af7-58026774cbe5\" (UID: \"265175b5-cbb8-4c80-8af7-58026774cbe5\") " Dec 01 21:04:11 crc kubenswrapper[4802]: I1201 21:04:11.580772 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/265175b5-cbb8-4c80-8af7-58026774cbe5-host" (OuterVolumeSpecName: "host") pod "265175b5-cbb8-4c80-8af7-58026774cbe5" (UID: "265175b5-cbb8-4c80-8af7-58026774cbe5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:04:11 crc kubenswrapper[4802]: I1201 21:04:11.581101 4802 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/265175b5-cbb8-4c80-8af7-58026774cbe5-host\") on node \"crc\" DevicePath \"\"" Dec 01 21:04:11 crc kubenswrapper[4802]: I1201 21:04:11.586058 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265175b5-cbb8-4c80-8af7-58026774cbe5-kube-api-access-mbmqc" (OuterVolumeSpecName: "kube-api-access-mbmqc") pod "265175b5-cbb8-4c80-8af7-58026774cbe5" (UID: "265175b5-cbb8-4c80-8af7-58026774cbe5"). InnerVolumeSpecName "kube-api-access-mbmqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:04:11 crc kubenswrapper[4802]: I1201 21:04:11.683388 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbmqc\" (UniqueName: \"kubernetes.io/projected/265175b5-cbb8-4c80-8af7-58026774cbe5-kube-api-access-mbmqc\") on node \"crc\" DevicePath \"\"" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.446646 4802 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="165af56d52950f01bf2f452d17f44423f9fa88e8af680896d1fccf4f5859084c" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.446707 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-r6cst" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.688564 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58wgl/crc-debug-fqd65"] Dec 01 21:04:12 crc kubenswrapper[4802]: E1201 21:04:12.688929 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265175b5-cbb8-4c80-8af7-58026774cbe5" containerName="container-00" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.688941 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="265175b5-cbb8-4c80-8af7-58026774cbe5" containerName="container-00" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.689147 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="265175b5-cbb8-4c80-8af7-58026774cbe5" containerName="container-00" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.689762 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.730724 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265175b5-cbb8-4c80-8af7-58026774cbe5" path="/var/lib/kubelet/pods/265175b5-cbb8-4c80-8af7-58026774cbe5/volumes" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.803748 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5nq7\" (UniqueName: \"kubernetes.io/projected/12ce8f18-79af-4a33-91db-e39837e48d69-kube-api-access-z5nq7\") pod \"crc-debug-fqd65\" (UID: \"12ce8f18-79af-4a33-91db-e39837e48d69\") " pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.804050 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12ce8f18-79af-4a33-91db-e39837e48d69-host\") pod \"crc-debug-fqd65\" (UID: \"12ce8f18-79af-4a33-91db-e39837e48d69\") " pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.906180 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12ce8f18-79af-4a33-91db-e39837e48d69-host\") pod \"crc-debug-fqd65\" (UID: \"12ce8f18-79af-4a33-91db-e39837e48d69\") " pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.906322 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5nq7\" (UniqueName: \"kubernetes.io/projected/12ce8f18-79af-4a33-91db-e39837e48d69-kube-api-access-z5nq7\") pod \"crc-debug-fqd65\" (UID: \"12ce8f18-79af-4a33-91db-e39837e48d69\") " pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.906333 4802 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12ce8f18-79af-4a33-91db-e39837e48d69-host\") pod \"crc-debug-fqd65\" (UID: \"12ce8f18-79af-4a33-91db-e39837e48d69\") " pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:12 crc kubenswrapper[4802]: I1201 21:04:12.943222 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5nq7\" (UniqueName: \"kubernetes.io/projected/12ce8f18-79af-4a33-91db-e39837e48d69-kube-api-access-z5nq7\") pod \"crc-debug-fqd65\" (UID: \"12ce8f18-79af-4a33-91db-e39837e48d69\") " pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:13 crc kubenswrapper[4802]: I1201 21:04:13.006627 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:13 crc kubenswrapper[4802]: I1201 21:04:13.456071 4802 generic.go:334] "Generic (PLEG): container finished" podID="12ce8f18-79af-4a33-91db-e39837e48d69" containerID="4b4d638cd275ac22b06aac4341ad3a480d4ba10992124e4626e0d72f0776a060" exitCode=0 Dec 01 21:04:13 crc kubenswrapper[4802]: I1201 21:04:13.456175 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/crc-debug-fqd65" event={"ID":"12ce8f18-79af-4a33-91db-e39837e48d69","Type":"ContainerDied","Data":"4b4d638cd275ac22b06aac4341ad3a480d4ba10992124e4626e0d72f0776a060"} Dec 01 21:04:13 crc kubenswrapper[4802]: I1201 21:04:13.456432 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/crc-debug-fqd65" event={"ID":"12ce8f18-79af-4a33-91db-e39837e48d69","Type":"ContainerStarted","Data":"ce5179406477254a02d24ff992cc328ef298fed0cfe72d20ec97bcd387ced08d"} Dec 01 21:04:13 crc kubenswrapper[4802]: I1201 21:04:13.492526 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58wgl/crc-debug-fqd65"] Dec 01 21:04:13 crc kubenswrapper[4802]: I1201 21:04:13.500675 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-58wgl/crc-debug-fqd65"] Dec 01 21:04:14 crc kubenswrapper[4802]: I1201 21:04:14.564499 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:14 crc kubenswrapper[4802]: I1201 21:04:14.649143 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12ce8f18-79af-4a33-91db-e39837e48d69-host\") pod \"12ce8f18-79af-4a33-91db-e39837e48d69\" (UID: \"12ce8f18-79af-4a33-91db-e39837e48d69\") " Dec 01 21:04:14 crc kubenswrapper[4802]: I1201 21:04:14.649428 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5nq7\" (UniqueName: \"kubernetes.io/projected/12ce8f18-79af-4a33-91db-e39837e48d69-kube-api-access-z5nq7\") pod \"12ce8f18-79af-4a33-91db-e39837e48d69\" (UID: \"12ce8f18-79af-4a33-91db-e39837e48d69\") " Dec 01 21:04:14 crc kubenswrapper[4802]: I1201 21:04:14.649271 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12ce8f18-79af-4a33-91db-e39837e48d69-host" (OuterVolumeSpecName: "host") pod "12ce8f18-79af-4a33-91db-e39837e48d69" (UID: "12ce8f18-79af-4a33-91db-e39837e48d69"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 21:04:14 crc kubenswrapper[4802]: I1201 21:04:14.650165 4802 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12ce8f18-79af-4a33-91db-e39837e48d69-host\") on node \"crc\" DevicePath \"\"" Dec 01 21:04:14 crc kubenswrapper[4802]: I1201 21:04:14.654176 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ce8f18-79af-4a33-91db-e39837e48d69-kube-api-access-z5nq7" (OuterVolumeSpecName: "kube-api-access-z5nq7") pod "12ce8f18-79af-4a33-91db-e39837e48d69" (UID: "12ce8f18-79af-4a33-91db-e39837e48d69"). 
InnerVolumeSpecName "kube-api-access-z5nq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:04:14 crc kubenswrapper[4802]: I1201 21:04:14.730995 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ce8f18-79af-4a33-91db-e39837e48d69" path="/var/lib/kubelet/pods/12ce8f18-79af-4a33-91db-e39837e48d69/volumes" Dec 01 21:04:14 crc kubenswrapper[4802]: I1201 21:04:14.756134 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5nq7\" (UniqueName: \"kubernetes.io/projected/12ce8f18-79af-4a33-91db-e39837e48d69-kube-api-access-z5nq7\") on node \"crc\" DevicePath \"\"" Dec 01 21:04:15 crc kubenswrapper[4802]: I1201 21:04:15.473266 4802 scope.go:117] "RemoveContainer" containerID="4b4d638cd275ac22b06aac4341ad3a480d4ba10992124e4626e0d72f0776a060" Dec 01 21:04:15 crc kubenswrapper[4802]: I1201 21:04:15.473323 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58wgl/crc-debug-fqd65" Dec 01 21:04:28 crc kubenswrapper[4802]: I1201 21:04:28.088575 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:04:28 crc kubenswrapper[4802]: I1201 21:04:28.089181 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:04:58 crc kubenswrapper[4802]: I1201 21:04:58.088573 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:04:58 crc kubenswrapper[4802]: I1201 21:04:58.089180 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:04:58 crc kubenswrapper[4802]: I1201 21:04:58.089255 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 21:04:58 crc kubenswrapper[4802]: I1201 21:04:58.089910 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d25c80dbaea60b5c692e1f1632f7e7ea1d154c74635474abccc9cbdc7b629eb"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 21:04:58 crc kubenswrapper[4802]: I1201 21:04:58.089972 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://9d25c80dbaea60b5c692e1f1632f7e7ea1d154c74635474abccc9cbdc7b629eb" gracePeriod=600 Dec 01 21:04:58 crc kubenswrapper[4802]: I1201 21:04:58.899310 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerID="9d25c80dbaea60b5c692e1f1632f7e7ea1d154c74635474abccc9cbdc7b629eb" exitCode=0 Dec 01 21:04:58 crc kubenswrapper[4802]: I1201 21:04:58.899528 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" 
event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"9d25c80dbaea60b5c692e1f1632f7e7ea1d154c74635474abccc9cbdc7b629eb"} Dec 01 21:04:58 crc kubenswrapper[4802]: I1201 21:04:58.900115 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerStarted","Data":"8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0"} Dec 01 21:04:58 crc kubenswrapper[4802]: I1201 21:04:58.900162 4802 scope.go:117] "RemoveContainer" containerID="4b70338f4804c25a5638315e749742c4bd02b2fd7d98c4fd928a1270daf431ca" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.076280 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69b4dccd58-q9lk2_2b86c57b-7125-4ead-88b7-7f5998651f39/barbican-api/0.log" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.126677 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69b4dccd58-q9lk2_2b86c57b-7125-4ead-88b7-7f5998651f39/barbican-api-log/0.log" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.275077 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67fcb6786-rkbj5_f9d802de-8a16-4fec-8768-b09841678cc8/barbican-keystone-listener/0.log" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.355145 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67fcb6786-rkbj5_f9d802de-8a16-4fec-8768-b09841678cc8/barbican-keystone-listener-log/0.log" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.368926 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6f8997c475-6j472_7116c50f-a3ef-4975-9dca-2070fbdac59a/barbican-worker/0.log" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.542135 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-6f8997c475-6j472_7116c50f-a3ef-4975-9dca-2070fbdac59a/barbican-worker-log/0.log" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.605996 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-sspw5_e04c0b98-f144-4917-be97-11a6b8f2b449/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.821997 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bdefd2ec-84d7-4e92-adeb-969ad52e35b6/ceilometer-central-agent/0.log" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.859462 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bdefd2ec-84d7-4e92-adeb-969ad52e35b6/proxy-httpd/0.log" Dec 01 21:05:01 crc kubenswrapper[4802]: I1201 21:05:01.878473 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bdefd2ec-84d7-4e92-adeb-969ad52e35b6/ceilometer-notification-agent/0.log" Dec 01 21:05:02 crc kubenswrapper[4802]: I1201 21:05:02.009077 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_bdefd2ec-84d7-4e92-adeb-969ad52e35b6/sg-core/0.log" Dec 01 21:05:02 crc kubenswrapper[4802]: I1201 21:05:02.015709 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-qmrlp_7db48dd7-0156-4073-918a-5f4e4c1244d9/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:02 crc kubenswrapper[4802]: I1201 21:05:02.732988 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fspg2_0b81c691-0a4b-48b6-b1e1-151e1cac847c/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:02 crc kubenswrapper[4802]: I1201 21:05:02.796121 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_372989c2-e54c-4031-9b41-926f7be64266/cinder-api-log/0.log" Dec 01 21:05:02 crc kubenswrapper[4802]: I1201 21:05:02.922677 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_372989c2-e54c-4031-9b41-926f7be64266/cinder-api/0.log" Dec 01 21:05:03 crc kubenswrapper[4802]: I1201 21:05:03.138588 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cb0c455b-a5d4-41cf-87c3-673a3deac7cb/probe/0.log" Dec 01 21:05:03 crc kubenswrapper[4802]: I1201 21:05:03.344306 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_cb0c455b-a5d4-41cf-87c3-673a3deac7cb/cinder-backup/0.log" Dec 01 21:05:03 crc kubenswrapper[4802]: I1201 21:05:03.385784 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0/cinder-scheduler/0.log" Dec 01 21:05:03 crc kubenswrapper[4802]: I1201 21:05:03.455647 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a48d1fb4-e796-4a1f-bf5b-abe1356ddbf0/probe/0.log" Dec 01 21:05:03 crc kubenswrapper[4802]: I1201 21:05:03.549767 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_83603474-dc08-4ea8-a158-cba205dab6da/probe/0.log" Dec 01 21:05:03 crc kubenswrapper[4802]: I1201 21:05:03.693259 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_83603474-dc08-4ea8-a158-cba205dab6da/cinder-volume/0.log" Dec 01 21:05:03 crc kubenswrapper[4802]: I1201 21:05:03.708586 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-g62m8_3a154c18-5d93-4d73-9e97-90fb21082eea/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:03 crc kubenswrapper[4802]: I1201 21:05:03.885505 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cwrlz_5a9ae28b-9e09-4918-b72b-e22abd2e6dec/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:03 crc kubenswrapper[4802]: I1201 21:05:03.985797 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-jwp25_66a9cb74-956c-4846-91b9-a4dac0834347/init/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.135661 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-jwp25_66a9cb74-956c-4846-91b9-a4dac0834347/init/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.158078 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-jwp25_66a9cb74-956c-4846-91b9-a4dac0834347/dnsmasq-dns/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.180575 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_30bb57c2-94ae-48ad-9e68-0b595b58246b/glance-httpd/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.344491 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_30bb57c2-94ae-48ad-9e68-0b595b58246b/glance-log/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.373517 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae/glance-httpd/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.443644 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fc0474b0-c4a6-4836-a27c-a70a4c3aa7ae/glance-log/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.730339 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-96mpg_189db1d5-3210-4707-b02f-8434a36a5791/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.737506 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b69f75cb8-xrkks_40185112-89e4-49c3-9ccc-0b190724c5ff/horizon/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.768543 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b69f75cb8-xrkks_40185112-89e4-49c3-9ccc-0b190724c5ff/horizon-log/0.log" Dec 01 21:05:04 crc kubenswrapper[4802]: I1201 21:05:04.891073 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mwtvq_ba286b73-fe12-499f-b959-296217015c6b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.018253 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7977dfdfb6-dnr99_f78fa699-c933-4160-b4dc-5b3db575ac17/keystone-api/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.083835 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29410381-4blfs_77085b4f-6030-4e2f-ac42-0a70cb2b269c/keystone-cron/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.195588 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_3358c4e8-0931-4d2e-82d6-527c54f3537c/kube-state-metrics/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.286279 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6kbmc_bad07c68-596f-44ca-9580-335176bd8049/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.432473 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-api-0_cde0d25e-888c-44b9-95a0-3bdae318a8b0/manila-api-log/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.457269 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_cde0d25e-888c-44b9-95a0-3bdae318a8b0/manila-api/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.500589 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_6998a6a9-71cf-4abd-ad6c-5e46bdae11cb/probe/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.611052 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_6998a6a9-71cf-4abd-ad6c-5e46bdae11cb/manila-scheduler/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.688900 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_54773416-92b4-406d-b8f1-c78331faa64e/manila-share/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.703345 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_54773416-92b4-406d-b8f1-c78331faa64e/probe/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.966074 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-84c4b4b5d7-2ph8r_86540934-a020-4a27-bfa6-62fbc7cfe412/neutron-httpd/0.log" Dec 01 21:05:05 crc kubenswrapper[4802]: I1201 21:05:05.990118 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-84c4b4b5d7-2ph8r_86540934-a020-4a27-bfa6-62fbc7cfe412/neutron-api/0.log" Dec 01 21:05:06 crc kubenswrapper[4802]: I1201 21:05:06.217253 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ppph7_2112a496-9a70-408c-99ec-211c3ba2defe/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:06 crc kubenswrapper[4802]: I1201 21:05:06.508032 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb/nova-api-log/0.log" Dec 01 21:05:06 crc kubenswrapper[4802]: I1201 21:05:06.722656 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f926810d-a46a-4504-9117-1584f02f386a/nova-cell0-conductor-conductor/0.log" Dec 01 21:05:06 crc kubenswrapper[4802]: I1201 21:05:06.834429 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9ea5908c-a2c2-4a9e-a626-17bff0d2fbcb/nova-api-api/0.log" Dec 01 21:05:06 crc kubenswrapper[4802]: I1201 21:05:06.966563 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_27620668-7b86-40fb-af4b-0c2524e097a7/nova-cell1-conductor-conductor/0.log" Dec 01 21:05:07 crc kubenswrapper[4802]: I1201 21:05:07.077748 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_66cec21d-0c5f-4b29-8268-fb8f64d68bfb/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 21:05:07 crc kubenswrapper[4802]: I1201 21:05:07.270809 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-q4d9z_5e27448f-4d4a-4a61-b5c7-46fb0a3aa2a4/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:07 crc kubenswrapper[4802]: I1201 21:05:07.328973 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_77a605ed-0bb9-4c8d-9a6b-86643ff44518/nova-metadata-log/0.log" Dec 01 21:05:07 crc kubenswrapper[4802]: I1201 21:05:07.694062 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5eedf56f-d2da-4526-94fc-346c826a891d/nova-scheduler-scheduler/0.log" Dec 01 21:05:07 crc kubenswrapper[4802]: I1201 21:05:07.704384 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_00254b08-a75a-4965-8b19-f4bc8ebf6f52/mysql-bootstrap/0.log" Dec 01 21:05:07 crc 
kubenswrapper[4802]: I1201 21:05:07.917472 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_00254b08-a75a-4965-8b19-f4bc8ebf6f52/mysql-bootstrap/0.log" Dec 01 21:05:07 crc kubenswrapper[4802]: I1201 21:05:07.969503 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_00254b08-a75a-4965-8b19-f4bc8ebf6f52/galera/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.146890 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f360e58-7047-4369-a8c8-4e0394586f62/mysql-bootstrap/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.264981 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f360e58-7047-4369-a8c8-4e0394586f62/mysql-bootstrap/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.349085 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_06e3c630-6e2f-4fde-96ac-feea509e3dcb/memcached/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.358163 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0f360e58-7047-4369-a8c8-4e0394586f62/galera/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.473370 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_19e13a2e-794d-4757-8c64-e1895a5e819d/openstackclient/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.559027 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_77a605ed-0bb9-4c8d-9a6b-86643ff44518/nova-metadata-metadata/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.574019 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-gczlr_4bbe4b6e-302e-4d6d-bc17-5f35baca1067/ovn-controller/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.645349 4802 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6p8lx_dd8140ed-9737-48ea-a0ea-15003dd90986/openstack-network-exporter/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.765180 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j9thx_15107e62-7679-460c-ab0e-f208b4a1ec76/ovsdb-server-init/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.891089 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j9thx_15107e62-7679-460c-ab0e-f208b4a1ec76/ovsdb-server/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.896092 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j9thx_15107e62-7679-460c-ab0e-f208b4a1ec76/ovsdb-server-init/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.918769 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j9thx_15107e62-7679-460c-ab0e-f208b4a1ec76/ovs-vswitchd/0.log" Dec 01 21:05:08 crc kubenswrapper[4802]: I1201 21:05:08.952212 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zpzvj_6c2f2991-db4b-4a58-807a-f3b617d9542f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.088133 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a/openstack-network-exporter/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.109412 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d3bbd5ca-60ac-4cc1-bdf9-acbf5329bb4a/ovn-northd/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.193594 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c/openstack-network-exporter/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 
21:05:09.276744 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fe3f7038-8eda-4ace-8ab0-26e6d4e5db4c/ovsdbserver-nb/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.280205 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9/openstack-network-exporter/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.346436 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e808d11c-2f7d-44fd-a3f5-e30d3c3c9cb9/ovsdbserver-sb/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.484504 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-646fbc85dd-2ttbm_3853723d-4452-4112-99bf-c3850a983f5d/placement-api/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.528573 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-646fbc85dd-2ttbm_3853723d-4452-4112-99bf-c3850a983f5d/placement-log/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.546419 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d078b34a-6a2a-4ea0-b7c8-c99ff6942170/setup-container/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.717652 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d078b34a-6a2a-4ea0-b7c8-c99ff6942170/setup-container/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.770298 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1fe488ab-29e8-4ed4-8663-be8e88c1a7ef/setup-container/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.771860 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_d078b34a-6a2a-4ea0-b7c8-c99ff6942170/rabbitmq/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.953874 4802 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1fe488ab-29e8-4ed4-8663-be8e88c1a7ef/setup-container/0.log" Dec 01 21:05:09 crc kubenswrapper[4802]: I1201 21:05:09.970450 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1fe488ab-29e8-4ed4-8663-be8e88c1a7ef/rabbitmq/0.log" Dec 01 21:05:10 crc kubenswrapper[4802]: I1201 21:05:10.035282 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-swrmc_5daf0e64-4a00-45c3-9830-46f81436faff/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:10 crc kubenswrapper[4802]: I1201 21:05:10.141165 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zlxwp_d5a316ab-c296-4ab8-8397-00e5a017d1cc/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:10 crc kubenswrapper[4802]: I1201 21:05:10.235235 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8jtzl_73e55b8c-927e-43fa-9104-8db3dc67fdde/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:10 crc kubenswrapper[4802]: I1201 21:05:10.252349 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-zqvg6_73a1d762-0b3b-4abf-9072-0b5bdba7bd72/ssh-known-hosts-edpm-deployment/0.log" Dec 01 21:05:10 crc kubenswrapper[4802]: I1201 21:05:10.829592 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_fa5d0b95-078d-4cb3-a597-5af9283e6503/test-operator-logs-container/0.log" Dec 01 21:05:10 crc kubenswrapper[4802]: I1201 21:05:10.860659 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_654db8d6-c501-48bf-bfeb-81f07e7c0e2e/tempest-tests-tempest-tests-runner/0.log" Dec 01 21:05:11 crc kubenswrapper[4802]: I1201 21:05:11.052324 4802 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-vngdm_43faaae9-0df9-4e49-a5cc-2fc51e008edc/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 21:05:29 crc kubenswrapper[4802]: I1201 21:05:29.110390 4802 scope.go:117] "RemoveContainer" containerID="f248abdd8fbcf01d0d8465de3be4f420729d604183af8cf669aa0ae052b59570" Dec 01 21:05:29 crc kubenswrapper[4802]: I1201 21:05:29.137069 4802 scope.go:117] "RemoveContainer" containerID="4e18a31b0a2a6a3bdb6a795c873831bd770c5eb707ef1c18b7448a285237f634" Dec 01 21:05:29 crc kubenswrapper[4802]: I1201 21:05:29.202453 4802 scope.go:117] "RemoveContainer" containerID="e55c26958f2e0870ea597a0bc6c2f409e9029e1c6b95b078d350525917eef443" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.085899 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/util/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.222563 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/util/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.235621 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/pull/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.293871 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/pull/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.431206 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/extract/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.440030 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/util/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.472876 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76e278c9c6e53d228e0007182ebb1a9ffdd2f8044cc4b07fff1d3b7031fwk2w_d4928d2b-4f1e-4d3c-a858-8180343a7405/pull/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.623866 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-6d8qg_18526d53-2d4c-4c40-885c-c83b3b378260/kube-rbac-proxy/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.709095 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-6d8qg_18526d53-2d4c-4c40-885c-c83b3b378260/manager/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.756276 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-n7mjb_e067c10a-e5d4-4e57-bf14-3b0bfc8ac069/kube-rbac-proxy/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.879623 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-n7mjb_e067c10a-e5d4-4e57-bf14-3b0bfc8ac069/manager/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: I1201 21:05:34.894047 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-qbvvz_b01ea1d5-0409-4c32-bb34-1b88253ceb05/kube-rbac-proxy/0.log" Dec 01 21:05:34 crc kubenswrapper[4802]: 
I1201 21:05:34.931393 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-qbvvz_b01ea1d5-0409-4c32-bb34-1b88253ceb05/manager/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.077701 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-776c976b46-x7bkb_eb553ce8-f696-4c6b-a745-aa1faa5f9356/kube-rbac-proxy/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.143963 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-776c976b46-x7bkb_eb553ce8-f696-4c6b-a745-aa1faa5f9356/manager/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.257413 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5ftkv_8626cbeb-8604-4371-b936-99cab8d76742/kube-rbac-proxy/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.269420 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-5ftkv_8626cbeb-8604-4371-b936-99cab8d76742/manager/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.394037 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-kmtdj_e5c436a3-2237-4f02-a9fc-b2aae90ce3b1/kube-rbac-proxy/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.459694 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-kmtdj_e5c436a3-2237-4f02-a9fc-b2aae90ce3b1/manager/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.533069 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-jbfz7_b40759e9-9a00-445c-964e-09f1d539d85e/kube-rbac-proxy/0.log" 
Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.689694 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-jcmlp_74aa06c0-a03f-4719-b751-a77ab3d472f2/kube-rbac-proxy/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.732925 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-jbfz7_b40759e9-9a00-445c-964e-09f1d539d85e/manager/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.740084 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-jcmlp_74aa06c0-a03f-4719-b751-a77ab3d472f2/manager/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.853157 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-2nm2t_0e1b6ed3-9b66-4279-9a9f-0685037df9c3/kube-rbac-proxy/0.log" Dec 01 21:05:35 crc kubenswrapper[4802]: I1201 21:05:35.961125 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-2nm2t_0e1b6ed3-9b66-4279-9a9f-0685037df9c3/manager/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.064380 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-7stkq_1891b769-8e7e-4375-b3ea-421a23fb7af4/kube-rbac-proxy/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.100231 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-7stkq_1891b769-8e7e-4375-b3ea-421a23fb7af4/manager/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.150113 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-p475v_6973effc-3f05-43cd-ba03-b9efe3b6db1d/kube-rbac-proxy/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.271596 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-p475v_6973effc-3f05-43cd-ba03-b9efe3b6db1d/manager/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.323938 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-cswfs_e58c799d-fcaa-4d9b-aa6c-c8947774bd2e/kube-rbac-proxy/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.391751 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-cswfs_e58c799d-fcaa-4d9b-aa6c-c8947774bd2e/manager/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.600845 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-djvhl_c7839b31-af95-4d33-a954-9615ea0c87a6/kube-rbac-proxy/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.627547 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-djvhl_c7839b31-af95-4d33-a954-9615ea0c87a6/manager/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.700485 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-vb97q_8e5dddc5-34ff-4a71-a626-3c9cea7ef30f/kube-rbac-proxy/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.843665 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-vb97q_8e5dddc5-34ff-4a71-a626-3c9cea7ef30f/manager/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.877905 
4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4552rv_d12b9eb3-946b-4578-8630-4cb6643ab36f/kube-rbac-proxy/0.log" Dec 01 21:05:36 crc kubenswrapper[4802]: I1201 21:05:36.913251 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4552rv_d12b9eb3-946b-4578-8630-4cb6643ab36f/manager/0.log" Dec 01 21:05:37 crc kubenswrapper[4802]: I1201 21:05:37.370252 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-t7nmz_47078ba5-f704-4077-93af-c0afffa2070f/registry-server/0.log" Dec 01 21:05:37 crc kubenswrapper[4802]: I1201 21:05:37.376018 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-849fbcc767-rv5gz_b26cce34-e8fa-4145-a0dd-daa30dfdde81/operator/0.log" Dec 01 21:05:37 crc kubenswrapper[4802]: I1201 21:05:37.563707 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-lfh62_0934b0fd-8a48-4dee-b668-08c7b631551f/kube-rbac-proxy/0.log" Dec 01 21:05:37 crc kubenswrapper[4802]: I1201 21:05:37.704952 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-lfh62_0934b0fd-8a48-4dee-b668-08c7b631551f/manager/0.log" Dec 01 21:05:37 crc kubenswrapper[4802]: I1201 21:05:37.839546 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-45m8m_088be214-85a6-4cb1-9e02-fcde44abb492/manager/0.log" Dec 01 21:05:37 crc kubenswrapper[4802]: I1201 21:05:37.853030 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-45m8m_088be214-85a6-4cb1-9e02-fcde44abb492/kube-rbac-proxy/0.log" Dec 01 21:05:38 
crc kubenswrapper[4802]: I1201 21:05:38.084430 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mnwdn_a3350b6c-2091-4a61-a78e-5a1bcdfd11cf/operator/0.log" Dec 01 21:05:38 crc kubenswrapper[4802]: I1201 21:05:38.086588 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-trddt_bb89e7bb-899f-4f3e-80cd-833fbc74db85/kube-rbac-proxy/0.log" Dec 01 21:05:38 crc kubenswrapper[4802]: I1201 21:05:38.137178 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-trddt_bb89e7bb-899f-4f3e-80cd-833fbc74db85/manager/0.log" Dec 01 21:05:38 crc kubenswrapper[4802]: I1201 21:05:38.288957 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8b6ht_369e7da7-22d9-470f-9ad0-48472ceffde4/kube-rbac-proxy/0.log" Dec 01 21:05:38 crc kubenswrapper[4802]: I1201 21:05:38.344790 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-79774867dd-5sjpr_aebbca29-71df-4bef-8108-66b226259a58/manager/0.log" Dec 01 21:05:38 crc kubenswrapper[4802]: I1201 21:05:38.402151 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-8b6ht_369e7da7-22d9-470f-9ad0-48472ceffde4/manager/0.log" Dec 01 21:05:38 crc kubenswrapper[4802]: I1201 21:05:38.806327 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wx6ct_92132c51-643c-4442-adf2-897bd2825fdf/manager/0.log" Dec 01 21:05:38 crc kubenswrapper[4802]: I1201 21:05:38.887289 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r89vq_97d3762b-15ce-45aa-9767-5be47c85e039/kube-rbac-proxy/0.log" Dec 01 21:05:38 crc kubenswrapper[4802]: I1201 21:05:38.914159 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r89vq_97d3762b-15ce-45aa-9767-5be47c85e039/manager/0.log" Dec 01 21:05:38 crc kubenswrapper[4802]: I1201 21:05:38.924506 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wx6ct_92132c51-643c-4442-adf2-897bd2825fdf/kube-rbac-proxy/0.log" Dec 01 21:05:59 crc kubenswrapper[4802]: I1201 21:05:59.035040 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-287bb_d8d9c52e-a041-4e4c-a364-ef09f105a206/control-plane-machine-set-operator/0.log" Dec 01 21:05:59 crc kubenswrapper[4802]: I1201 21:05:59.211372 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mx65z_d2c2d1ec-c588-4247-aae2-c228404a38e0/kube-rbac-proxy/0.log" Dec 01 21:05:59 crc kubenswrapper[4802]: I1201 21:05:59.227678 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mx65z_d2c2d1ec-c588-4247-aae2-c228404a38e0/machine-api-operator/0.log" Dec 01 21:06:12 crc kubenswrapper[4802]: I1201 21:06:12.841858 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-7zjnt_07ba6850-9e9e-42d2-bd61-dc97bc185119/cert-manager-controller/0.log" Dec 01 21:06:12 crc kubenswrapper[4802]: I1201 21:06:12.957172 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-r5lnb_2608cc8e-13d1-43b6-b033-1b62df0333fb/cert-manager-cainjector/0.log" Dec 01 21:06:12 crc kubenswrapper[4802]: I1201 21:06:12.973116 4802 
log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-p86w8_b8339c52-f023-4f4c-9cf2-948f94a27e7a/cert-manager-webhook/0.log" Dec 01 21:06:24 crc kubenswrapper[4802]: I1201 21:06:24.543359 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-tlx97_8a21a68c-7399-4632-9564-1c0650125ea5/nmstate-console-plugin/0.log" Dec 01 21:06:24 crc kubenswrapper[4802]: I1201 21:06:24.673590 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-pkwpd_df60afec-8603-441f-88bb-31d054b7fea5/nmstate-handler/0.log" Dec 01 21:06:24 crc kubenswrapper[4802]: I1201 21:06:24.699097 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8g77d_9923b241-3a3d-4051-b5c4-6677dff519ed/nmstate-metrics/0.log" Dec 01 21:06:24 crc kubenswrapper[4802]: I1201 21:06:24.721922 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8g77d_9923b241-3a3d-4051-b5c4-6677dff519ed/kube-rbac-proxy/0.log" Dec 01 21:06:24 crc kubenswrapper[4802]: I1201 21:06:24.863545 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-n9pjm_75a57a90-06f5-444e-897d-1191d7838e8b/nmstate-operator/0.log" Dec 01 21:06:24 crc kubenswrapper[4802]: I1201 21:06:24.905815 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-9kcsk_b504895d-c4d4-4261-ab7d-24532e127650/nmstate-webhook/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.138548 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-x5zbd_36c33153-2c15-48db-9ab8-a52854a85093/controller/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.198381 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-f8648f98b-x5zbd_36c33153-2c15-48db-9ab8-a52854a85093/kube-rbac-proxy/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.275691 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-frr-files/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.446269 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-frr-files/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.461521 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-metrics/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.473445 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-reloader/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.511900 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-reloader/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.623633 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-frr-files/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.628029 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-reloader/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.650127 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-metrics/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.686916 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-metrics/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.840691 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-reloader/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.848464 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-frr-files/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.861311 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/cp-metrics/0.log" Dec 01 21:06:39 crc kubenswrapper[4802]: I1201 21:06:39.867042 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/controller/0.log" Dec 01 21:06:40 crc kubenswrapper[4802]: I1201 21:06:40.508161 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/frr-metrics/0.log" Dec 01 21:06:40 crc kubenswrapper[4802]: I1201 21:06:40.535008 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/kube-rbac-proxy-frr/0.log" Dec 01 21:06:40 crc kubenswrapper[4802]: I1201 21:06:40.568466 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/kube-rbac-proxy/0.log" Dec 01 21:06:40 crc kubenswrapper[4802]: I1201 21:06:40.714846 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/reloader/0.log" Dec 01 21:06:40 crc kubenswrapper[4802]: I1201 21:06:40.798792 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-5rvzc_14b6cfe2-8222-45da-808e-2a3d64d13b94/frr-k8s-webhook-server/0.log" Dec 01 21:06:40 crc kubenswrapper[4802]: I1201 21:06:40.975786 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79cd97f75d-bjkc2_27545afc-bda6-468c-b9c2-8ab3182546c8/manager/0.log" Dec 01 21:06:41 crc kubenswrapper[4802]: I1201 21:06:41.228546 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d94989d67-mpm9p_0ca0e887-c648-46c5-941a-96fc3a8e551e/webhook-server/0.log" Dec 01 21:06:41 crc kubenswrapper[4802]: I1201 21:06:41.296584 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zjgml_4c77b924-7d9d-48b6-9e00-476f7df7104c/kube-rbac-proxy/0.log" Dec 01 21:06:41 crc kubenswrapper[4802]: I1201 21:06:41.803301 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zjgml_4c77b924-7d9d-48b6-9e00-476f7df7104c/speaker/0.log" Dec 01 21:06:42 crc kubenswrapper[4802]: I1201 21:06:42.169026 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-btwhk_ca5ccff0-46eb-46dd-aa6b-a0069276275d/frr/0.log" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.232308 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d5chn"] Dec 01 21:06:50 crc kubenswrapper[4802]: E1201 21:06:50.233335 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ce8f18-79af-4a33-91db-e39837e48d69" containerName="container-00" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.233351 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ce8f18-79af-4a33-91db-e39837e48d69" containerName="container-00" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.233653 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ce8f18-79af-4a33-91db-e39837e48d69" 
containerName="container-00" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.255752 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.268759 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d5chn"] Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.353057 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj4bc\" (UniqueName: \"kubernetes.io/projected/5f1c282e-e2a5-40b2-9496-89e5e37f615b-kube-api-access-sj4bc\") pod \"redhat-operators-d5chn\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.353287 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-utilities\") pod \"redhat-operators-d5chn\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.353433 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-catalog-content\") pod \"redhat-operators-d5chn\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.455936 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-utilities\") pod \"redhat-operators-d5chn\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " pod="openshift-marketplace/redhat-operators-d5chn" 
Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.456045 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-catalog-content\") pod \"redhat-operators-d5chn\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.456104 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj4bc\" (UniqueName: \"kubernetes.io/projected/5f1c282e-e2a5-40b2-9496-89e5e37f615b-kube-api-access-sj4bc\") pod \"redhat-operators-d5chn\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.456684 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-utilities\") pod \"redhat-operators-d5chn\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.456691 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-catalog-content\") pod \"redhat-operators-d5chn\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 21:06:50.651412 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj4bc\" (UniqueName: \"kubernetes.io/projected/5f1c282e-e2a5-40b2-9496-89e5e37f615b-kube-api-access-sj4bc\") pod \"redhat-operators-d5chn\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:50 crc kubenswrapper[4802]: I1201 
21:06:50.895727 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:06:51 crc kubenswrapper[4802]: I1201 21:06:51.335094 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d5chn"] Dec 01 21:06:51 crc kubenswrapper[4802]: I1201 21:06:51.936243 4802 generic.go:334] "Generic (PLEG): container finished" podID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerID="a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62" exitCode=0 Dec 01 21:06:51 crc kubenswrapper[4802]: I1201 21:06:51.936307 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5chn" event={"ID":"5f1c282e-e2a5-40b2-9496-89e5e37f615b","Type":"ContainerDied","Data":"a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62"} Dec 01 21:06:51 crc kubenswrapper[4802]: I1201 21:06:51.936980 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5chn" event={"ID":"5f1c282e-e2a5-40b2-9496-89e5e37f615b","Type":"ContainerStarted","Data":"bc034d159a2f45c30342a55edb62cb192e2acb59d82fc337ba9a1a878a456cb4"} Dec 01 21:06:51 crc kubenswrapper[4802]: I1201 21:06:51.939653 4802 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 21:06:53 crc kubenswrapper[4802]: I1201 21:06:53.959488 4802 generic.go:334] "Generic (PLEG): container finished" podID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerID="5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861" exitCode=0 Dec 01 21:06:53 crc kubenswrapper[4802]: I1201 21:06:53.959626 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5chn" event={"ID":"5f1c282e-e2a5-40b2-9496-89e5e37f615b","Type":"ContainerDied","Data":"5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861"} Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 
21:06:54.146615 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/util/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.309867 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/pull/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.331895 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/pull/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.346942 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/util/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.463497 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/util/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.523750 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/pull/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.555823 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fvqrb7_cbe6c903-91e9-4c8d-b378-4be5220c8ab0/extract/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.652494 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/util/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.777367 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/util/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.825356 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/pull/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.847220 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/pull/0.log" Dec 01 21:06:54 crc kubenswrapper[4802]: I1201 21:06:54.970100 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5chn" event={"ID":"5f1c282e-e2a5-40b2-9496-89e5e37f615b","Type":"ContainerStarted","Data":"4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769"} Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.034368 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/util/0.log" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.041478 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/pull/0.log" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.041584 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d5chn" 
podStartSLOduration=2.321071173 podStartE2EDuration="5.041565607s" podCreationTimestamp="2025-12-01 21:06:50 +0000 UTC" firstStartedPulling="2025-12-01 21:06:51.939407668 +0000 UTC m=+4233.501967309" lastFinishedPulling="2025-12-01 21:06:54.659902102 +0000 UTC m=+4236.222461743" observedRunningTime="2025-12-01 21:06:55.038888574 +0000 UTC m=+4236.601448215" watchObservedRunningTime="2025-12-01 21:06:55.041565607 +0000 UTC m=+4236.604125248" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.107818 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832tn9s_f0139070-b593-45e2-9098-969b27b2be38/extract/0.log" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.230737 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-utilities/0.log" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.434114 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-content/0.log" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.460386 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-utilities/0.log" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.480052 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-content/0.log" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.617375 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-utilities/0.log" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.674688 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/extract-content/0.log" Dec 01 21:06:55 crc kubenswrapper[4802]: I1201 21:06:55.885885 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-utilities/0.log" Dec 01 21:06:56 crc kubenswrapper[4802]: I1201 21:06:56.166168 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-content/0.log" Dec 01 21:06:56 crc kubenswrapper[4802]: I1201 21:06:56.182989 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q86qm_a3b32805-cd2f-493c-a2be-2d993678cd06/registry-server/0.log" Dec 01 21:06:56 crc kubenswrapper[4802]: I1201 21:06:56.210235 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-utilities/0.log" Dec 01 21:06:56 crc kubenswrapper[4802]: I1201 21:06:56.230128 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-content/0.log" Dec 01 21:06:56 crc kubenswrapper[4802]: I1201 21:06:56.367519 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-utilities/0.log" Dec 01 21:06:56 crc kubenswrapper[4802]: I1201 21:06:56.407267 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/extract-content/0.log" Dec 01 21:06:56 crc kubenswrapper[4802]: I1201 21:06:56.596833 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fpwgt_4fffad75-c42a-40d4-a2f3-d770091b01fa/marketplace-operator/0.log" Dec 01 21:06:56 crc kubenswrapper[4802]: I1201 21:06:56.673124 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-utilities/0.log" Dec 01 21:06:56 crc kubenswrapper[4802]: I1201 21:06:56.938528 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wcdzr_4f8205a1-38e0-47e3-be8f-5add6cdac5cb/registry-server/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.023269 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-utilities/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.029712 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-content/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.089079 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-content/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.279935 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-content/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.395485 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/registry-server/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.406562 4802 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-fgxlr_038eb692-9117-4953-94a0-420941ce4b7a/extract-utilities/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.527208 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5chn_5f1c282e-e2a5-40b2-9496-89e5e37f615b/extract-utilities/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.660830 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5chn_5f1c282e-e2a5-40b2-9496-89e5e37f615b/extract-content/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.663767 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5chn_5f1c282e-e2a5-40b2-9496-89e5e37f615b/extract-utilities/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.711407 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5chn_5f1c282e-e2a5-40b2-9496-89e5e37f615b/extract-content/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.854928 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5chn_5f1c282e-e2a5-40b2-9496-89e5e37f615b/extract-content/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.889362 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5chn_5f1c282e-e2a5-40b2-9496-89e5e37f615b/extract-utilities/0.log" Dec 01 21:06:57 crc kubenswrapper[4802]: I1201 21:06:57.895590 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-d5chn_5f1c282e-e2a5-40b2-9496-89e5e37f615b/registry-server/0.log" Dec 01 21:06:58 crc kubenswrapper[4802]: I1201 21:06:58.008774 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-utilities/0.log" Dec 01 
21:06:58 crc kubenswrapper[4802]: I1201 21:06:58.088665 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:06:58 crc kubenswrapper[4802]: I1201 21:06:58.088721 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:06:58 crc kubenswrapper[4802]: I1201 21:06:58.171530 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-content/0.log" Dec 01 21:06:58 crc kubenswrapper[4802]: I1201 21:06:58.182100 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-utilities/0.log" Dec 01 21:06:58 crc kubenswrapper[4802]: I1201 21:06:58.189247 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-content/0.log" Dec 01 21:06:58 crc kubenswrapper[4802]: I1201 21:06:58.386759 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-content/0.log" Dec 01 21:06:58 crc kubenswrapper[4802]: I1201 21:06:58.410603 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/extract-utilities/0.log" Dec 01 21:06:58 crc kubenswrapper[4802]: I1201 21:06:58.893695 4802 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hhc44_3118d07d-4c45-4759-b2a0-792e5f4ca0fc/registry-server/0.log" Dec 01 21:07:00 crc kubenswrapper[4802]: I1201 21:07:00.895849 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:07:00 crc kubenswrapper[4802]: I1201 21:07:00.896436 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:07:00 crc kubenswrapper[4802]: I1201 21:07:00.954842 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:07:01 crc kubenswrapper[4802]: I1201 21:07:01.066438 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:07:01 crc kubenswrapper[4802]: I1201 21:07:01.193650 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d5chn"] Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.032255 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d5chn" podUID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerName="registry-server" containerID="cri-o://4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769" gracePeriod=2 Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.506501 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.616350 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj4bc\" (UniqueName: \"kubernetes.io/projected/5f1c282e-e2a5-40b2-9496-89e5e37f615b-kube-api-access-sj4bc\") pod \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.616487 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-utilities\") pod \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.616645 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-catalog-content\") pod \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\" (UID: \"5f1c282e-e2a5-40b2-9496-89e5e37f615b\") " Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.617442 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-utilities" (OuterVolumeSpecName: "utilities") pod "5f1c282e-e2a5-40b2-9496-89e5e37f615b" (UID: "5f1c282e-e2a5-40b2-9496-89e5e37f615b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.618860 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.622119 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1c282e-e2a5-40b2-9496-89e5e37f615b-kube-api-access-sj4bc" (OuterVolumeSpecName: "kube-api-access-sj4bc") pod "5f1c282e-e2a5-40b2-9496-89e5e37f615b" (UID: "5f1c282e-e2a5-40b2-9496-89e5e37f615b"). InnerVolumeSpecName "kube-api-access-sj4bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.720910 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj4bc\" (UniqueName: \"kubernetes.io/projected/5f1c282e-e2a5-40b2-9496-89e5e37f615b-kube-api-access-sj4bc\") on node \"crc\" DevicePath \"\"" Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.746773 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f1c282e-e2a5-40b2-9496-89e5e37f615b" (UID: "5f1c282e-e2a5-40b2-9496-89e5e37f615b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:07:03 crc kubenswrapper[4802]: I1201 21:07:03.822950 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1c282e-e2a5-40b2-9496-89e5e37f615b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.044774 4802 generic.go:334] "Generic (PLEG): container finished" podID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerID="4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769" exitCode=0 Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.044819 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5chn" event={"ID":"5f1c282e-e2a5-40b2-9496-89e5e37f615b","Type":"ContainerDied","Data":"4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769"} Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.044848 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d5chn" event={"ID":"5f1c282e-e2a5-40b2-9496-89e5e37f615b","Type":"ContainerDied","Data":"bc034d159a2f45c30342a55edb62cb192e2acb59d82fc337ba9a1a878a456cb4"} Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.044866 4802 scope.go:117] "RemoveContainer" containerID="4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.044910 4802 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d5chn" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.063604 4802 scope.go:117] "RemoveContainer" containerID="5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.095470 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d5chn"] Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.105436 4802 scope.go:117] "RemoveContainer" containerID="a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.113685 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d5chn"] Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.139056 4802 scope.go:117] "RemoveContainer" containerID="4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769" Dec 01 21:07:04 crc kubenswrapper[4802]: E1201 21:07:04.139541 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769\": container with ID starting with 4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769 not found: ID does not exist" containerID="4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.139578 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769"} err="failed to get container status \"4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769\": rpc error: code = NotFound desc = could not find container \"4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769\": container with ID starting with 4db6bb55f64478a3100d6362e4c23e676b8498bc702e569c71546bb718bc3769 not found: ID does 
not exist" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.139607 4802 scope.go:117] "RemoveContainer" containerID="5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861" Dec 01 21:07:04 crc kubenswrapper[4802]: E1201 21:07:04.140077 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861\": container with ID starting with 5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861 not found: ID does not exist" containerID="5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.140140 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861"} err="failed to get container status \"5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861\": rpc error: code = NotFound desc = could not find container \"5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861\": container with ID starting with 5cdfdb1fb14dd0f502b8b6b82058389c8b928bf9cd85d2d3cc77e77f838d5861 not found: ID does not exist" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.140187 4802 scope.go:117] "RemoveContainer" containerID="a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62" Dec 01 21:07:04 crc kubenswrapper[4802]: E1201 21:07:04.140770 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62\": container with ID starting with a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62 not found: ID does not exist" containerID="a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.140807 4802 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62"} err="failed to get container status \"a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62\": rpc error: code = NotFound desc = could not find container \"a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62\": container with ID starting with a8e3b622419a21326afbdb81c2ad2c42d32a9aa5d8cc2f96c5e5821bd4f76d62 not found: ID does not exist" Dec 01 21:07:04 crc kubenswrapper[4802]: I1201 21:07:04.731523 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" path="/var/lib/kubelet/pods/5f1c282e-e2a5-40b2-9496-89e5e37f615b/volumes" Dec 01 21:07:28 crc kubenswrapper[4802]: I1201 21:07:28.089053 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:07:28 crc kubenswrapper[4802]: I1201 21:07:28.089686 4802 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:07:58 crc kubenswrapper[4802]: I1201 21:07:58.088437 4802 patch_prober.go:28] interesting pod/machine-config-daemon-tw4xd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 21:07:58 crc kubenswrapper[4802]: I1201 21:07:58.089856 4802 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 21:07:58 crc kubenswrapper[4802]: I1201 21:07:58.089967 4802 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" Dec 01 21:07:58 crc kubenswrapper[4802]: I1201 21:07:58.090786 4802 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0"} pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 21:07:58 crc kubenswrapper[4802]: I1201 21:07:58.090919 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" containerName="machine-config-daemon" containerID="cri-o://8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" gracePeriod=600 Dec 01 21:07:58 crc kubenswrapper[4802]: E1201 21:07:58.215104 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:07:58 crc kubenswrapper[4802]: I1201 21:07:58.580387 4802 generic.go:334] "Generic (PLEG): container finished" podID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" 
containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" exitCode=0 Dec 01 21:07:58 crc kubenswrapper[4802]: I1201 21:07:58.580455 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" event={"ID":"23e1ef99-f507-42ea-a076-4fc1681c7e8c","Type":"ContainerDied","Data":"8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0"} Dec 01 21:07:58 crc kubenswrapper[4802]: I1201 21:07:58.580775 4802 scope.go:117] "RemoveContainer" containerID="9d25c80dbaea60b5c692e1f1632f7e7ea1d154c74635474abccc9cbdc7b629eb" Dec 01 21:07:58 crc kubenswrapper[4802]: I1201 21:07:58.581525 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:07:58 crc kubenswrapper[4802]: E1201 21:07:58.582170 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:08:13 crc kubenswrapper[4802]: I1201 21:08:13.720707 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:08:13 crc kubenswrapper[4802]: E1201 21:08:13.721899 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:08:26 crc kubenswrapper[4802]: I1201 
21:08:26.720569 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:08:26 crc kubenswrapper[4802]: E1201 21:08:26.721605 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:08:37 crc kubenswrapper[4802]: I1201 21:08:37.721673 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:08:37 crc kubenswrapper[4802]: E1201 21:08:37.723065 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:08:39 crc kubenswrapper[4802]: I1201 21:08:39.034789 4802 generic.go:334] "Generic (PLEG): container finished" podID="c03def70-5ce8-4c92-90f3-e6a80af5f461" containerID="c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49" exitCode=0 Dec 01 21:08:39 crc kubenswrapper[4802]: I1201 21:08:39.034948 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58wgl/must-gather-fz6ld" event={"ID":"c03def70-5ce8-4c92-90f3-e6a80af5f461","Type":"ContainerDied","Data":"c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49"} Dec 01 21:08:39 crc kubenswrapper[4802]: I1201 21:08:39.037019 4802 scope.go:117] "RemoveContainer" 
containerID="c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49" Dec 01 21:08:39 crc kubenswrapper[4802]: I1201 21:08:39.753853 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-58wgl_must-gather-fz6ld_c03def70-5ce8-4c92-90f3-e6a80af5f461/gather/0.log" Dec 01 21:08:48 crc kubenswrapper[4802]: I1201 21:08:48.725732 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:08:48 crc kubenswrapper[4802]: E1201 21:08:48.726510 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:08:49 crc kubenswrapper[4802]: I1201 21:08:49.571376 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58wgl/must-gather-fz6ld"] Dec 01 21:08:49 crc kubenswrapper[4802]: I1201 21:08:49.571955 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-58wgl/must-gather-fz6ld" podUID="c03def70-5ce8-4c92-90f3-e6a80af5f461" containerName="copy" containerID="cri-o://bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724" gracePeriod=2 Dec 01 21:08:49 crc kubenswrapper[4802]: I1201 21:08:49.580163 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58wgl/must-gather-fz6ld"] Dec 01 21:08:49 crc kubenswrapper[4802]: I1201 21:08:49.980841 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-58wgl_must-gather-fz6ld_c03def70-5ce8-4c92-90f3-e6a80af5f461/copy/0.log" Dec 01 21:08:49 crc kubenswrapper[4802]: I1201 21:08:49.981993 4802 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.135051 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzb7b\" (UniqueName: \"kubernetes.io/projected/c03def70-5ce8-4c92-90f3-e6a80af5f461-kube-api-access-wzb7b\") pod \"c03def70-5ce8-4c92-90f3-e6a80af5f461\" (UID: \"c03def70-5ce8-4c92-90f3-e6a80af5f461\") " Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.135105 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c03def70-5ce8-4c92-90f3-e6a80af5f461-must-gather-output\") pod \"c03def70-5ce8-4c92-90f3-e6a80af5f461\" (UID: \"c03def70-5ce8-4c92-90f3-e6a80af5f461\") " Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.143520 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03def70-5ce8-4c92-90f3-e6a80af5f461-kube-api-access-wzb7b" (OuterVolumeSpecName: "kube-api-access-wzb7b") pod "c03def70-5ce8-4c92-90f3-e6a80af5f461" (UID: "c03def70-5ce8-4c92-90f3-e6a80af5f461"). InnerVolumeSpecName "kube-api-access-wzb7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.156533 4802 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-58wgl_must-gather-fz6ld_c03def70-5ce8-4c92-90f3-e6a80af5f461/copy/0.log" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.160443 4802 generic.go:334] "Generic (PLEG): container finished" podID="c03def70-5ce8-4c92-90f3-e6a80af5f461" containerID="bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724" exitCode=143 Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.160507 4802 scope.go:117] "RemoveContainer" containerID="bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.160663 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58wgl/must-gather-fz6ld" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.205779 4802 scope.go:117] "RemoveContainer" containerID="c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.237526 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzb7b\" (UniqueName: \"kubernetes.io/projected/c03def70-5ce8-4c92-90f3-e6a80af5f461-kube-api-access-wzb7b\") on node \"crc\" DevicePath \"\"" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.248714 4802 scope.go:117] "RemoveContainer" containerID="bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724" Dec 01 21:08:50 crc kubenswrapper[4802]: E1201 21:08:50.249084 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724\": container with ID starting with bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724 not found: ID does not exist" 
containerID="bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.249184 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724"} err="failed to get container status \"bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724\": rpc error: code = NotFound desc = could not find container \"bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724\": container with ID starting with bea61adfff86fd9ba522158a42990ea30c92b65b697a2fda41be1f18b4520724 not found: ID does not exist" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.249289 4802 scope.go:117] "RemoveContainer" containerID="c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49" Dec 01 21:08:50 crc kubenswrapper[4802]: E1201 21:08:50.249733 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49\": container with ID starting with c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49 not found: ID does not exist" containerID="c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.249832 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49"} err="failed to get container status \"c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49\": rpc error: code = NotFound desc = could not find container \"c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49\": container with ID starting with c5fd9001c57456ca9d1f323369661b9030f0b526718f331020e27c6b6fa25e49 not found: ID does not exist" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.341627 4802 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03def70-5ce8-4c92-90f3-e6a80af5f461-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c03def70-5ce8-4c92-90f3-e6a80af5f461" (UID: "c03def70-5ce8-4c92-90f3-e6a80af5f461"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.441424 4802 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c03def70-5ce8-4c92-90f3-e6a80af5f461-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 21:08:50 crc kubenswrapper[4802]: I1201 21:08:50.731075 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03def70-5ce8-4c92-90f3-e6a80af5f461" path="/var/lib/kubelet/pods/c03def70-5ce8-4c92-90f3-e6a80af5f461/volumes" Dec 01 21:08:59 crc kubenswrapper[4802]: I1201 21:08:59.720002 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:08:59 crc kubenswrapper[4802]: E1201 21:08:59.720909 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:09:11 crc kubenswrapper[4802]: I1201 21:09:11.720536 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:09:11 crc kubenswrapper[4802]: E1201 21:09:11.721231 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:09:24 crc kubenswrapper[4802]: I1201 21:09:24.722098 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:09:24 crc kubenswrapper[4802]: E1201 21:09:24.729835 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:09:29 crc kubenswrapper[4802]: I1201 21:09:29.338951 4802 scope.go:117] "RemoveContainer" containerID="ee68fc32f65e79ddbc20579149624bee16f2a66201673d9b086da3928054a16d" Dec 01 21:09:37 crc kubenswrapper[4802]: I1201 21:09:37.720691 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:09:37 crc kubenswrapper[4802]: E1201 21:09:37.721713 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:09:49 crc kubenswrapper[4802]: I1201 21:09:49.720304 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:09:49 crc 
kubenswrapper[4802]: E1201 21:09:49.721129 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:10:01 crc kubenswrapper[4802]: I1201 21:10:01.719994 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:10:01 crc kubenswrapper[4802]: E1201 21:10:01.720960 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:10:12 crc kubenswrapper[4802]: I1201 21:10:12.720586 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:10:12 crc kubenswrapper[4802]: E1201 21:10:12.721533 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:10:26 crc kubenswrapper[4802]: I1201 21:10:26.720884 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 
01 21:10:26 crc kubenswrapper[4802]: E1201 21:10:26.721771 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:10:29 crc kubenswrapper[4802]: I1201 21:10:29.427387 4802 scope.go:117] "RemoveContainer" containerID="886fecb64b19c7e447e305c1ad337e2c326f6a0ef16a7ae748bcc371b6849fea" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.242910 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f58p7"] Dec 01 21:10:33 crc kubenswrapper[4802]: E1201 21:10:33.243993 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerName="extract-utilities" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.244011 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerName="extract-utilities" Dec 01 21:10:33 crc kubenswrapper[4802]: E1201 21:10:33.244026 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03def70-5ce8-4c92-90f3-e6a80af5f461" containerName="copy" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.244033 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03def70-5ce8-4c92-90f3-e6a80af5f461" containerName="copy" Dec 01 21:10:33 crc kubenswrapper[4802]: E1201 21:10:33.244043 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03def70-5ce8-4c92-90f3-e6a80af5f461" containerName="gather" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.244051 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03def70-5ce8-4c92-90f3-e6a80af5f461" containerName="gather" Dec 01 
21:10:33 crc kubenswrapper[4802]: E1201 21:10:33.244067 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerName="extract-content" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.244077 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerName="extract-content" Dec 01 21:10:33 crc kubenswrapper[4802]: E1201 21:10:33.244098 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerName="registry-server" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.244107 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerName="registry-server" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.244359 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03def70-5ce8-4c92-90f3-e6a80af5f461" containerName="gather" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.244383 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1c282e-e2a5-40b2-9496-89e5e37f615b" containerName="registry-server" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.244401 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03def70-5ce8-4c92-90f3-e6a80af5f461" containerName="copy" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.246230 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.253584 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f58p7"] Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.257014 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89r7t\" (UniqueName: \"kubernetes.io/projected/91848b70-1bb1-4b7c-8996-754320f6e26d-kube-api-access-89r7t\") pod \"certified-operators-f58p7\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") " pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.257079 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-utilities\") pod \"certified-operators-f58p7\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") " pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.257116 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-catalog-content\") pod \"certified-operators-f58p7\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") " pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.358974 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89r7t\" (UniqueName: \"kubernetes.io/projected/91848b70-1bb1-4b7c-8996-754320f6e26d-kube-api-access-89r7t\") pod \"certified-operators-f58p7\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") " pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.359341 4802 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-utilities\") pod \"certified-operators-f58p7\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") " pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.359373 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-catalog-content\") pod \"certified-operators-f58p7\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") " pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.360138 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-utilities\") pod \"certified-operators-f58p7\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") " pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.360173 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-catalog-content\") pod \"certified-operators-f58p7\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") " pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.382445 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89r7t\" (UniqueName: \"kubernetes.io/projected/91848b70-1bb1-4b7c-8996-754320f6e26d-kube-api-access-89r7t\") pod \"certified-operators-f58p7\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") " pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.444040 4802 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-xg76d"] Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.445924 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.456799 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xg76d"] Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.565225 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5fc5\" (UniqueName: \"kubernetes.io/projected/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-kube-api-access-f5fc5\") pod \"community-operators-xg76d\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") " pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.565351 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-utilities\") pod \"community-operators-xg76d\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") " pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.565442 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-catalog-content\") pod \"community-operators-xg76d\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") " pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.578751 4802 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.666615 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-utilities\") pod \"community-operators-xg76d\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") " pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.666701 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-catalog-content\") pod \"community-operators-xg76d\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") " pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.666784 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5fc5\" (UniqueName: \"kubernetes.io/projected/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-kube-api-access-f5fc5\") pod \"community-operators-xg76d\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") " pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.667276 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-catalog-content\") pod \"community-operators-xg76d\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") " pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.667188 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-utilities\") pod \"community-operators-xg76d\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") " 
pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.688325 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5fc5\" (UniqueName: \"kubernetes.io/projected/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-kube-api-access-f5fc5\") pod \"community-operators-xg76d\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") " pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:33 crc kubenswrapper[4802]: I1201 21:10:33.770430 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:34 crc kubenswrapper[4802]: I1201 21:10:34.021921 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f58p7"] Dec 01 21:10:34 crc kubenswrapper[4802]: I1201 21:10:34.264799 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xg76d"] Dec 01 21:10:34 crc kubenswrapper[4802]: W1201 21:10:34.282283 4802 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee8ee239_57ba_4dd4_8832_da2bbb5cdf69.slice/crio-563dfa04e1a8ba957c9ecdec50ed1638d373ce0a52443fabed5ab861fb5e1e53 WatchSource:0}: Error finding container 563dfa04e1a8ba957c9ecdec50ed1638d373ce0a52443fabed5ab861fb5e1e53: Status 404 returned error can't find the container with id 563dfa04e1a8ba957c9ecdec50ed1638d373ce0a52443fabed5ab861fb5e1e53 Dec 01 21:10:34 crc kubenswrapper[4802]: I1201 21:10:34.334519 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg76d" event={"ID":"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69","Type":"ContainerStarted","Data":"563dfa04e1a8ba957c9ecdec50ed1638d373ce0a52443fabed5ab861fb5e1e53"} Dec 01 21:10:34 crc kubenswrapper[4802]: I1201 21:10:34.338987 4802 generic.go:334] "Generic (PLEG): container finished" 
podID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerID="6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8" exitCode=0 Dec 01 21:10:34 crc kubenswrapper[4802]: I1201 21:10:34.339040 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f58p7" event={"ID":"91848b70-1bb1-4b7c-8996-754320f6e26d","Type":"ContainerDied","Data":"6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8"} Dec 01 21:10:34 crc kubenswrapper[4802]: I1201 21:10:34.339072 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f58p7" event={"ID":"91848b70-1bb1-4b7c-8996-754320f6e26d","Type":"ContainerStarted","Data":"ba808bbdd7530989143ddb77231699c9fc8ce13e09be9c800cf78ab45e0ecb65"} Dec 01 21:10:35 crc kubenswrapper[4802]: I1201 21:10:35.353342 4802 generic.go:334] "Generic (PLEG): container finished" podID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerID="de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35" exitCode=0 Dec 01 21:10:35 crc kubenswrapper[4802]: I1201 21:10:35.353419 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg76d" event={"ID":"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69","Type":"ContainerDied","Data":"de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35"} Dec 01 21:10:36 crc kubenswrapper[4802]: I1201 21:10:36.367679 4802 generic.go:334] "Generic (PLEG): container finished" podID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerID="e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a" exitCode=0 Dec 01 21:10:36 crc kubenswrapper[4802]: I1201 21:10:36.367757 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f58p7" event={"ID":"91848b70-1bb1-4b7c-8996-754320f6e26d","Type":"ContainerDied","Data":"e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a"} Dec 01 21:10:37 crc kubenswrapper[4802]: I1201 21:10:37.379821 
4802 generic.go:334] "Generic (PLEG): container finished" podID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerID="87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2" exitCode=0 Dec 01 21:10:37 crc kubenswrapper[4802]: I1201 21:10:37.379910 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg76d" event={"ID":"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69","Type":"ContainerDied","Data":"87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2"} Dec 01 21:10:37 crc kubenswrapper[4802]: I1201 21:10:37.389125 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f58p7" event={"ID":"91848b70-1bb1-4b7c-8996-754320f6e26d","Type":"ContainerStarted","Data":"113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e"} Dec 01 21:10:37 crc kubenswrapper[4802]: I1201 21:10:37.429584 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f58p7" podStartSLOduration=1.970094734 podStartE2EDuration="4.429562952s" podCreationTimestamp="2025-12-01 21:10:33 +0000 UTC" firstStartedPulling="2025-12-01 21:10:34.341105792 +0000 UTC m=+4455.903665433" lastFinishedPulling="2025-12-01 21:10:36.80057399 +0000 UTC m=+4458.363133651" observedRunningTime="2025-12-01 21:10:37.428272022 +0000 UTC m=+4458.990831683" watchObservedRunningTime="2025-12-01 21:10:37.429562952 +0000 UTC m=+4458.992122593" Dec 01 21:10:38 crc kubenswrapper[4802]: I1201 21:10:38.399871 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg76d" event={"ID":"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69","Type":"ContainerStarted","Data":"654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b"} Dec 01 21:10:38 crc kubenswrapper[4802]: I1201 21:10:38.428453 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xg76d" 
podStartSLOduration=2.847102654 podStartE2EDuration="5.42843542s" podCreationTimestamp="2025-12-01 21:10:33 +0000 UTC" firstStartedPulling="2025-12-01 21:10:35.354757382 +0000 UTC m=+4456.917317023" lastFinishedPulling="2025-12-01 21:10:37.936090138 +0000 UTC m=+4459.498649789" observedRunningTime="2025-12-01 21:10:38.42553476 +0000 UTC m=+4459.988094411" watchObservedRunningTime="2025-12-01 21:10:38.42843542 +0000 UTC m=+4459.990995061" Dec 01 21:10:39 crc kubenswrapper[4802]: I1201 21:10:39.720852 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0" Dec 01 21:10:39 crc kubenswrapper[4802]: E1201 21:10:39.721588 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c" Dec 01 21:10:43 crc kubenswrapper[4802]: I1201 21:10:43.579605 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:43 crc kubenswrapper[4802]: I1201 21:10:43.580112 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:43 crc kubenswrapper[4802]: I1201 21:10:43.628192 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f58p7" Dec 01 21:10:43 crc kubenswrapper[4802]: I1201 21:10:43.770695 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xg76d" Dec 01 21:10:43 crc kubenswrapper[4802]: I1201 21:10:43.770791 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-xg76d"
Dec 01 21:10:43 crc kubenswrapper[4802]: I1201 21:10:43.850882 4802 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xg76d"
Dec 01 21:10:44 crc kubenswrapper[4802]: I1201 21:10:44.525356 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xg76d"
Dec 01 21:10:44 crc kubenswrapper[4802]: I1201 21:10:44.561670 4802 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f58p7"
Dec 01 21:10:45 crc kubenswrapper[4802]: I1201 21:10:45.097479 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xg76d"]
Dec 01 21:10:46 crc kubenswrapper[4802]: I1201 21:10:46.494479 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xg76d" podUID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerName="registry-server" containerID="cri-o://654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b" gracePeriod=2
Dec 01 21:10:46 crc kubenswrapper[4802]: I1201 21:10:46.863474 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f58p7"]
Dec 01 21:10:46 crc kubenswrapper[4802]: I1201 21:10:46.864062 4802 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f58p7" podUID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerName="registry-server" containerID="cri-o://113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e" gracePeriod=2
Dec 01 21:10:47 crc kubenswrapper[4802]: E1201 21:10:47.084755 4802 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91848b70_1bb1_4b7c_8996_754320f6e26d.slice/crio-conmon-113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91848b70_1bb1_4b7c_8996_754320f6e26d.slice/crio-113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.318118 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f58p7"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.430630 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xg76d"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.461432 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-utilities\") pod \"91848b70-1bb1-4b7c-8996-754320f6e26d\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") "
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.461589 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-catalog-content\") pod \"91848b70-1bb1-4b7c-8996-754320f6e26d\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") "
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.461643 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89r7t\" (UniqueName: \"kubernetes.io/projected/91848b70-1bb1-4b7c-8996-754320f6e26d-kube-api-access-89r7t\") pod \"91848b70-1bb1-4b7c-8996-754320f6e26d\" (UID: \"91848b70-1bb1-4b7c-8996-754320f6e26d\") "
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.462314 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-utilities" (OuterVolumeSpecName: "utilities") pod "91848b70-1bb1-4b7c-8996-754320f6e26d" (UID: "91848b70-1bb1-4b7c-8996-754320f6e26d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.467380 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91848b70-1bb1-4b7c-8996-754320f6e26d-kube-api-access-89r7t" (OuterVolumeSpecName: "kube-api-access-89r7t") pod "91848b70-1bb1-4b7c-8996-754320f6e26d" (UID: "91848b70-1bb1-4b7c-8996-754320f6e26d"). InnerVolumeSpecName "kube-api-access-89r7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.506464 4802 generic.go:334] "Generic (PLEG): container finished" podID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerID="654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b" exitCode=0
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.506572 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg76d" event={"ID":"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69","Type":"ContainerDied","Data":"654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b"}
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.506611 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xg76d" event={"ID":"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69","Type":"ContainerDied","Data":"563dfa04e1a8ba957c9ecdec50ed1638d373ce0a52443fabed5ab861fb5e1e53"}
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.506630 4802 scope.go:117] "RemoveContainer" containerID="654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.507593 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xg76d"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.510003 4802 generic.go:334] "Generic (PLEG): container finished" podID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerID="113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e" exitCode=0
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.510036 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f58p7" event={"ID":"91848b70-1bb1-4b7c-8996-754320f6e26d","Type":"ContainerDied","Data":"113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e"}
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.510063 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f58p7" event={"ID":"91848b70-1bb1-4b7c-8996-754320f6e26d","Type":"ContainerDied","Data":"ba808bbdd7530989143ddb77231699c9fc8ce13e09be9c800cf78ab45e0ecb65"}
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.510273 4802 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f58p7"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.519465 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91848b70-1bb1-4b7c-8996-754320f6e26d" (UID: "91848b70-1bb1-4b7c-8996-754320f6e26d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.525186 4802 scope.go:117] "RemoveContainer" containerID="87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.551730 4802 scope.go:117] "RemoveContainer" containerID="de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.562777 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-utilities\") pod \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") "
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.562935 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-catalog-content\") pod \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") "
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.562999 4802 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5fc5\" (UniqueName: \"kubernetes.io/projected/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-kube-api-access-f5fc5\") pod \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\" (UID: \"ee8ee239-57ba-4dd4-8832-da2bbb5cdf69\") "
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.563557 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.563580 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89r7t\" (UniqueName: \"kubernetes.io/projected/91848b70-1bb1-4b7c-8996-754320f6e26d-kube-api-access-89r7t\") on node \"crc\" DevicePath \"\""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.563611 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91848b70-1bb1-4b7c-8996-754320f6e26d-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.564480 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-utilities" (OuterVolumeSpecName: "utilities") pod "ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" (UID: "ee8ee239-57ba-4dd4-8832-da2bbb5cdf69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.569397 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-kube-api-access-f5fc5" (OuterVolumeSpecName: "kube-api-access-f5fc5") pod "ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" (UID: "ee8ee239-57ba-4dd4-8832-da2bbb5cdf69"). InnerVolumeSpecName "kube-api-access-f5fc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.574889 4802 scope.go:117] "RemoveContainer" containerID="654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b"
Dec 01 21:10:47 crc kubenswrapper[4802]: E1201 21:10:47.581552 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b\": container with ID starting with 654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b not found: ID does not exist" containerID="654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.581606 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b"} err="failed to get container status \"654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b\": rpc error: code = NotFound desc = could not find container \"654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b\": container with ID starting with 654a00e8346b2257e4e6b6210245ca45dbf62d4ae84625e290a74251ae6f0a6b not found: ID does not exist"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.581633 4802 scope.go:117] "RemoveContainer" containerID="87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2"
Dec 01 21:10:47 crc kubenswrapper[4802]: E1201 21:10:47.584297 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2\": container with ID starting with 87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2 not found: ID does not exist" containerID="87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.584340 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2"} err="failed to get container status \"87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2\": rpc error: code = NotFound desc = could not find container \"87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2\": container with ID starting with 87c90ff0170ffb4ca6191b159e1779ebf6ea7a646cbf73bcb53f452ee4c8e2c2 not found: ID does not exist"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.584375 4802 scope.go:117] "RemoveContainer" containerID="de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35"
Dec 01 21:10:47 crc kubenswrapper[4802]: E1201 21:10:47.586705 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35\": container with ID starting with de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35 not found: ID does not exist" containerID="de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.586759 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35"} err="failed to get container status \"de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35\": rpc error: code = NotFound desc = could not find container \"de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35\": container with ID starting with de4466b05d0f42afbf25a488510854348ba34f5c5b17d4e37d46bf4a369edb35 not found: ID does not exist"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.586788 4802 scope.go:117] "RemoveContainer" containerID="113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.621239 4802 scope.go:117] "RemoveContainer" containerID="e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.664854 4802 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5fc5\" (UniqueName: \"kubernetes.io/projected/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-kube-api-access-f5fc5\") on node \"crc\" DevicePath \"\""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.664891 4802 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.667647 4802 scope.go:117] "RemoveContainer" containerID="6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.714015 4802 scope.go:117] "RemoveContainer" containerID="113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e"
Dec 01 21:10:47 crc kubenswrapper[4802]: E1201 21:10:47.714596 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e\": container with ID starting with 113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e not found: ID does not exist" containerID="113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.714638 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e"} err="failed to get container status \"113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e\": rpc error: code = NotFound desc = could not find container \"113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e\": container with ID starting with 113f8d87e649b350f60d53456b610b6f9b0380f18dcfa40c5588337cdeeefa3e not found: ID does not exist"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.714664 4802 scope.go:117] "RemoveContainer" containerID="e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a"
Dec 01 21:10:47 crc kubenswrapper[4802]: E1201 21:10:47.714966 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a\": container with ID starting with e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a not found: ID does not exist" containerID="e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.714990 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a"} err="failed to get container status \"e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a\": rpc error: code = NotFound desc = could not find container \"e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a\": container with ID starting with e58e03c1c2746f74c1f341f8703a74b15d593cd677f2b6995a5440c06d11cf3a not found: ID does not exist"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.715004 4802 scope.go:117] "RemoveContainer" containerID="6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8"
Dec 01 21:10:47 crc kubenswrapper[4802]: E1201 21:10:47.715293 4802 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8\": container with ID starting with 6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8 not found: ID does not exist" containerID="6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.715340 4802 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8"} err="failed to get container status \"6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8\": rpc error: code = NotFound desc = could not find container \"6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8\": container with ID starting with 6deebd58b018e335f6a7db1e3b0b2ecaaae93a7fd98e13cd6f69d223c4a1b5b8 not found: ID does not exist"
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.854117 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f58p7"]
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.862735 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f58p7"]
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.921700 4802 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" (UID: "ee8ee239-57ba-4dd4-8832-da2bbb5cdf69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 21:10:47 crc kubenswrapper[4802]: I1201 21:10:47.970770 4802 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 21:10:48 crc kubenswrapper[4802]: I1201 21:10:48.143164 4802 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xg76d"]
Dec 01 21:10:48 crc kubenswrapper[4802]: I1201 21:10:48.151391 4802 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xg76d"]
Dec 01 21:10:48 crc kubenswrapper[4802]: I1201 21:10:48.732150 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91848b70-1bb1-4b7c-8996-754320f6e26d" path="/var/lib/kubelet/pods/91848b70-1bb1-4b7c-8996-754320f6e26d/volumes"
Dec 01 21:10:48 crc kubenswrapper[4802]: I1201 21:10:48.733222 4802 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" path="/var/lib/kubelet/pods/ee8ee239-57ba-4dd4-8832-da2bbb5cdf69/volumes"
Dec 01 21:10:54 crc kubenswrapper[4802]: I1201 21:10:54.720626 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0"
Dec 01 21:10:54 crc kubenswrapper[4802]: E1201 21:10:54.721993 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 21:11:09 crc kubenswrapper[4802]: I1201 21:11:09.720518 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0"
Dec 01 21:11:09 crc kubenswrapper[4802]: E1201 21:11:09.721494 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 21:11:22 crc kubenswrapper[4802]: I1201 21:11:22.720578 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0"
Dec 01 21:11:22 crc kubenswrapper[4802]: E1201 21:11:22.721871 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.055480 4802 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zb699"]
Dec 01 21:11:32 crc kubenswrapper[4802]: E1201 21:11:32.056993 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerName="extract-content"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.057016 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerName="extract-content"
Dec 01 21:11:32 crc kubenswrapper[4802]: E1201 21:11:32.057045 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerName="extract-utilities"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.057059 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerName="extract-utilities"
Dec 01 21:11:32 crc kubenswrapper[4802]: E1201 21:11:32.057095 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerName="registry-server"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.057108 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerName="registry-server"
Dec 01 21:11:32 crc kubenswrapper[4802]: E1201 21:11:32.057151 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerName="extract-utilities"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.057162 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerName="extract-utilities"
Dec 01 21:11:32 crc kubenswrapper[4802]: E1201 21:11:32.057187 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerName="extract-content"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.057275 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerName="extract-content"
Dec 01 21:11:32 crc kubenswrapper[4802]: E1201 21:11:32.057307 4802 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerName="registry-server"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.057318 4802 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerName="registry-server"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.057684 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="91848b70-1bb1-4b7c-8996-754320f6e26d" containerName="registry-server"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.057713 4802 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8ee239-57ba-4dd4-8832-da2bbb5cdf69" containerName="registry-server"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.061625 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.114990 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb699"]
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.159898 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/749d091b-6d49-4b3c-b5ff-809bcafd918c-utilities\") pod \"redhat-marketplace-zb699\" (UID: \"749d091b-6d49-4b3c-b5ff-809bcafd918c\") " pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.160390 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkxsh\" (UniqueName: \"kubernetes.io/projected/749d091b-6d49-4b3c-b5ff-809bcafd918c-kube-api-access-lkxsh\") pod \"redhat-marketplace-zb699\" (UID: \"749d091b-6d49-4b3c-b5ff-809bcafd918c\") " pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.160452 4802 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/749d091b-6d49-4b3c-b5ff-809bcafd918c-catalog-content\") pod \"redhat-marketplace-zb699\" (UID: \"749d091b-6d49-4b3c-b5ff-809bcafd918c\") " pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.262571 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/749d091b-6d49-4b3c-b5ff-809bcafd918c-catalog-content\") pod \"redhat-marketplace-zb699\" (UID: \"749d091b-6d49-4b3c-b5ff-809bcafd918c\") " pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.262764 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/749d091b-6d49-4b3c-b5ff-809bcafd918c-utilities\") pod \"redhat-marketplace-zb699\" (UID: \"749d091b-6d49-4b3c-b5ff-809bcafd918c\") " pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.262806 4802 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkxsh\" (UniqueName: \"kubernetes.io/projected/749d091b-6d49-4b3c-b5ff-809bcafd918c-kube-api-access-lkxsh\") pod \"redhat-marketplace-zb699\" (UID: \"749d091b-6d49-4b3c-b5ff-809bcafd918c\") " pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.263625 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/749d091b-6d49-4b3c-b5ff-809bcafd918c-catalog-content\") pod \"redhat-marketplace-zb699\" (UID: \"749d091b-6d49-4b3c-b5ff-809bcafd918c\") " pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.264218 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/749d091b-6d49-4b3c-b5ff-809bcafd918c-utilities\") pod \"redhat-marketplace-zb699\" (UID: \"749d091b-6d49-4b3c-b5ff-809bcafd918c\") " pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.285711 4802 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkxsh\" (UniqueName: \"kubernetes.io/projected/749d091b-6d49-4b3c-b5ff-809bcafd918c-kube-api-access-lkxsh\") pod \"redhat-marketplace-zb699\" (UID: \"749d091b-6d49-4b3c-b5ff-809bcafd918c\") " pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.454764 4802 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb699"
Dec 01 21:11:32 crc kubenswrapper[4802]: I1201 21:11:32.942389 4802 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb699"]
Dec 01 21:11:33 crc kubenswrapper[4802]: I1201 21:11:33.009184 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb699" event={"ID":"749d091b-6d49-4b3c-b5ff-809bcafd918c","Type":"ContainerStarted","Data":"c1e97ee12b0f42a48118bec01ac807a36124bc93f9630ad6692f0955d6dcce11"}
Dec 01 21:11:34 crc kubenswrapper[4802]: I1201 21:11:34.026162 4802 generic.go:334] "Generic (PLEG): container finished" podID="749d091b-6d49-4b3c-b5ff-809bcafd918c" containerID="ece1ec175eef2a99a3ea8251cf6faad5c5126525687652e0d946fc7da2e36af3" exitCode=0
Dec 01 21:11:34 crc kubenswrapper[4802]: I1201 21:11:34.026325 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb699" event={"ID":"749d091b-6d49-4b3c-b5ff-809bcafd918c","Type":"ContainerDied","Data":"ece1ec175eef2a99a3ea8251cf6faad5c5126525687652e0d946fc7da2e36af3"}
Dec 01 21:11:35 crc kubenswrapper[4802]: I1201 21:11:35.040751 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb699" event={"ID":"749d091b-6d49-4b3c-b5ff-809bcafd918c","Type":"ContainerStarted","Data":"b6ca2bab1f7845297b38ec5348ba3992db123708e8fedd2e7503b307fe486174"}
Dec 01 21:11:36 crc kubenswrapper[4802]: I1201 21:11:36.057251 4802 generic.go:334] "Generic (PLEG): container finished" podID="749d091b-6d49-4b3c-b5ff-809bcafd918c" containerID="b6ca2bab1f7845297b38ec5348ba3992db123708e8fedd2e7503b307fe486174" exitCode=0
Dec 01 21:11:36 crc kubenswrapper[4802]: I1201 21:11:36.057300 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb699" event={"ID":"749d091b-6d49-4b3c-b5ff-809bcafd918c","Type":"ContainerDied","Data":"b6ca2bab1f7845297b38ec5348ba3992db123708e8fedd2e7503b307fe486174"}
Dec 01 21:11:36 crc kubenswrapper[4802]: I1201 21:11:36.057627 4802 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb699" event={"ID":"749d091b-6d49-4b3c-b5ff-809bcafd918c","Type":"ContainerStarted","Data":"9e743abf4b2a0775b217e7e477c15e33ac74c8d99afa9f79d8bff40d61184102"}
Dec 01 21:11:36 crc kubenswrapper[4802]: I1201 21:11:36.088776 4802 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zb699" podStartSLOduration=2.651776074 podStartE2EDuration="4.088750075s" podCreationTimestamp="2025-12-01 21:11:32 +0000 UTC" firstStartedPulling="2025-12-01 21:11:34.029425471 +0000 UTC m=+4515.591985132" lastFinishedPulling="2025-12-01 21:11:35.466399482 +0000 UTC m=+4517.028959133" observedRunningTime="2025-12-01 21:11:36.074765377 +0000 UTC m=+4517.637325028" watchObservedRunningTime="2025-12-01 21:11:36.088750075 +0000 UTC m=+4517.651309726"
Dec 01 21:11:36 crc kubenswrapper[4802]: I1201 21:11:36.723834 4802 scope.go:117] "RemoveContainer" containerID="8c9005f4cf58b22fc1828c37547de2dc97ee764ad4fa168137785ace7fe8bed0"
Dec 01 21:11:36 crc kubenswrapper[4802]: E1201 21:11:36.724228 4802 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tw4xd_openshift-machine-config-operator(23e1ef99-f507-42ea-a076-4fc1681c7e8c)\"" pod="openshift-machine-config-operator/machine-config-daemon-tw4xd" podUID="23e1ef99-f507-42ea-a076-4fc1681c7e8c"